Richard L. Ballard
Richard L. Ballard, Ph.D. (born March 22, 1940) is credited as the first to define knowledge as a science rather than as a philosophy. Knowledge science is articulated in such documents as "Physical Theory of Knowledge and Computation",[1] a definition of knowledge ("Knowledge = Theory + Information"), "Axiomatic Definition of Knowledge & Human-like Constraint-based Reasoning"[2] (originally released as "General, Quantitative Theory of Knowledge", 1987-1993[3]), a knowledge engineering methodology, and the world's first theory-based semantic software architecture and knowledge capture product, called Mark 3,[4] which simulates concepts, ideas, thought patterns and instance models whose relationships are constrained by well-justified theory. Ballard has forged a career combining teaching, scientific research, practical engineering projects of national importance, and business development. He has been a founding member of four industries: computer software, computer-based education, computer-based publishing, and knowledge science and engineering, and has been a noted teacher and leader in all four. Ballard has received 128 software citations and developed 21 educational software workshops, 3 management software workshops, and a 10-week professional knowledge engineering course. He has been published in 35 publications and technical reports. He earned a Doctor of Philosophy, with distinction, in solid state physics at UC Berkeley (1970), and held a NASA Fellowship at the Institute for Space Physics, Columbia University (1963).
Primary Principles of Knowledge Science
Knowledge science defines a physical theory of knowledge that is consistent across the physical domains of human thought, nature, and physics. Within this science is defined an understanding of human thought, its explicit and tacit expression, and how machines can capture and simulate all human knowledge and reason with that knowledge the way people do.
Theory-based semantics is the discipline of precisely defining the meaning of concepts, ideas, and thought patterns by their relationships to other concepts and ideas, based on well-justified theory. Because meaning is represented by concept relationships, theory-based semantics is language-independent, expressing the precise meaning held in the mind rather than what is expressed through language. First proposed by Richard L. Ballard, Ph.D. between 1987 and 1993, theory-based semantics is the governing principle of theory-based semantic systems, which allow machines to reason with the same theories that humans use.
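The idea that a concept's meaning is carried by its relationships, not by any word attached to it, can be sketched as a small semantic network. This is a hypothetical illustration, not Ballard's actual Mark 3 implementation; all identifiers and relation names are invented for the example.

```python
# Hypothetical sketch of a theory-based semantic network: a concept's
# "meaning" is the set of its typed relationships to other concepts.
# Identifiers are opaque -- no word or language is attached to a node.

class Concept:
    def __init__(self, ident):
        self.ident = ident        # opaque identifier, not a word
        self.relations = []       # list of (relation_type, other_concept)

    def relate(self, relation_type, other):
        self.relations.append((relation_type, other))

    def meaning(self):
        """The concept's meaning: its relationships to other concepts."""
        return {(rel, other.ident) for rel, other in self.relations}

water = Concept("C001")
liquid = Concept("C002")
hydrogen = Concept("C003")

water.relate("is_a", liquid)
water.relate("composed_of", hydrogen)

print(sorted(water.meaning()))
# [('composed_of', 'C003'), ('is_a', 'C002')]
```

Because the nodes carry no words, any language's vocabulary could later be attached to the identifiers as labels without changing the represented meaning.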
Reference Source: "Fundamental Definitions in Knowledge Science & Engineering" by Dr. Richard L. Ballard, 12/2004, self-published; course book for the first knowledge engineering course taught at UC Irvine, California, entitled "Creating Systems That Know." Other supporting sources are Ballard's "Physical Theory of Knowledge and Computation" (2006). White papers and presentations can be viewed at KnowledgeFoundations.com.
Language Independence
The knowledge science of Richard L. Ballard states that theory-based semantic technologies are language-independent because their concept, idea, and thought-pattern representations are faithful representations of the language of thought that occurs in the human brain, not of spoken or written language. Language is descriptive and, for that reason, highly ambiguous: there can be multiple meanings for a word, or multiple words for one meaning. Theory-based semantics regards language, and more precisely the words of a language, as dataforms, no different to the software than any other object such as a GIF or JPEG.
For this reason, language-of-thought representations of concepts, ideas, and thought patterns, once captured, can be output easily and cost-effectively to any language.
Thought Patterns
In the knowledge science of Richard L. Ballard, thought patterns are rational compositions of concepts that, once learned, are repeatedly used and understood by humans. Examples of thought patterns include hierarchical relationships such as organizational charts or parent/child taxonomy relationships, sets, and lists. Broader and more abstract thought patterns include the rational pattern of beginning-middle-end, which applies to many aspects of human life such as games, music, literature, formal correspondence, periods of history, and so forth. Thought patterns are an essential element of the "language of thought" and of theory-based semantics.
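A thought pattern such as beginning-middle-end can be viewed as a reusable template whose slots different instance models fill. The following sketch is illustrative only; the slot names and the `instantiate` helper are assumptions, not part of any described system.

```python
# Illustrative sketch: an abstract thought pattern as a tuple of slots,
# reused across domains by binding different concrete instances to it.

BEGINNING_MIDDLE_END = ("beginning", "middle", "end")

def instantiate(pattern, values):
    """Bind concrete instances to the slots of an abstract thought pattern."""
    if len(values) != len(pattern):
        raise ValueError("instance model must fill every slot")
    return dict(zip(pattern, values))

# The same pattern applied to two domains: a story and a formal letter.
story = instantiate(BEGINNING_MIDDLE_END, ["exposition", "climax", "resolution"])
letter = instantiate(BEGINNING_MIDDLE_END, ["salutation", "body", "closing"])

print(story["middle"])   # climax
print(letter["end"])     # closing
```

The point of the sketch is that the pattern itself is learned once and carries meaning across every domain it is instantiated in.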
Knowledge = Theory + Information
Knowledge, and its descriptive formula "Knowledge = Theory + Information", is a core principle of knowledge science (based in part on the work of Carl Sagan and John Sowa), advanced and proposed by Dr. Ballard between 1987 and 1993. The premise of Ballard's argument is that knowledge is anything that decreases the uncertainty of questions. This occurs through a "declarative" process involving theory (conditional reasoning power) and information (the facts of situations and circumstances). When the two are combined through an a priori and/or analytical reasoning process, uncertainty is decreased and replaced with a state of knowingness. Lessons learned from the application of theory to answer questions are then spontaneously applied to other situations and circumstances. Knowledge can be captured from documents, drawings, illustrations, forms, spreadsheets, books, contracts, policies and procedures, reference sources, and from the minds of people themselves through a methodology called knowledge engineering. Knowledge engineering work products are called editforms. Editforms are fed into theory-based semantic publishing tools that are used to automate complex decision-making or to transfer job knowledge, among other uses.
Theory represents more than 85% of knowledge. Theory is "a priori" (known before the fact). It is the element that constrains the meaning of concepts, ideas, and thought patterns, and the conditional reasoning power required to answer our "how", "why", and "what if" questions. Learned through enculturation, education, and life experience, theory shapes our behavior and the way we understand our world. Well-justified theories, such as those proven most successful by science, engineering, and business, are the most valuable. Theory is predictive and considers all possibilities. Once learned, it is used for decades, centuries, and millennia. Most of the core theories that shape our social behavior, for example, were conceived 20,000 to 40,000 years ago and passed down through the generations. Many of our financial theories, such as "buy low, sell high" or the principles of interest, were conceived and put into use by our ancestors millennia ago. Modern theories, such as those underlying wireless communications, were conceived in the 1940s and put into practice in the 1970s. The facts of circumstances and situations may change rapidly, but the underlying theory that gives them meaning does not.
Information represents approximately 15% of knowledge content and consists of the instances of anything that exists in time and space that can be processed by the senses, measured, and counted. Information is the facts of circumstances and situations. Information is "a posteriori" (known after the fact). It is the "who", "what", "when", "where", and "how much" instances of circumstances and situations. Conventional information technologies are designed to store and transport facts, but these systems require people to use the theory in their own minds to understand and apply those facts for useful ends.
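The division of labor described above, where theory supplies conditional reasoning power and information supplies the facts, can be sketched in a few lines. This is a hedged illustration of the "Knowledge = Theory + Information" formula using the article's own "buy low, sell high" example; the function and fact names are invented for the sketch.

```python
# Sketch of "Knowledge = Theory + Information": theory is an a priori
# conditional rule; information is the a posteriori facts of a situation.
# Applying the rule to the facts answers a question neither could alone.

# Theory: conditional reasoning power ("how" / "why" / "what if")
def buy_low_sell_high(purchase_price, sale_price):
    """A priori rule: a trade is profitable when sale exceeds purchase."""
    return sale_price > purchase_price

# Information: facts of the situation ("who" / "what" / "when" / "how much")
facts = {"purchase_price": 40, "sale_price": 55}

# Knowledge: combining theory with information reduces uncertainty
# about the question "was this trade profitable?"
profitable = buy_low_sell_high(facts["purchase_price"], facts["sale_price"])
print(profitable)  # True
```

A conventional database could store only the `facts` dictionary; the rule, and hence the answer, would remain in the human reader's head, which is the gap the paragraph above attributes to conventional information technologies.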