

@Jordan Etem: Givers, Takers, Government. 🧭☀️Building Smart Benevolent Governance.
Reasoning, problem solving

Early researchers developed algorithms that imitated the step-by-step reasoning that humans use when they solve puzzles or make logical deductions.[94] By the late 1980s and 1990s, AI research had developed methods for dealing with uncertain or incomplete information, employing concepts from probability and economics.[95] These algorithms proved to be insufficient for solving large reasoning problems because they experienced a "combinatorial explosion": they became exponentially slower as the problems grew larger.[76] Even humans rarely use the step-by-step deduction that early AI research could model. They solve most of their problems using fast, intuitive judgments.[96]

Knowledge representation

Figure: An ontology represents knowledge as a set of concepts within a domain and the relationships between those concepts.

Main articles: Knowledge representation and Commonsense knowledge

Knowledge representation[97] and knowledge engineering[98] are central to classical AI research. Some "expert systems" attempt to gather the explicit knowledge possessed by experts in some narrow domain. In addition, some projects attempt to gather the "commonsense knowledge" known to the average person into a database containing extensive knowledge about the world. Among the things a comprehensive commonsense knowledge base would contain are: objects, properties, categories and relations between objects;[99] situations, events, states and time;[100] causes and effects;[101] knowledge about knowledge (what we know about what other people know);[102] and many other, less well researched domains.

A representation of "what exists" is an ontology: the set of objects, relations, concepts, and properties formally described so that software agents can interpret them. The semantics of these are captured as description logic concepts, roles, and individuals, and are typically implemented as classes, properties, and individuals in the Web Ontology Language.[103] The most general ontologies are called upper ontologies, which attempt to provide a foundation for all other knowledge[104] by acting as mediators between domain ontologies, which cover specific knowledge about a particular knowledge domain (field of interest or area of concern). Such formal knowledge representations can be used in content-based indexing and retrieval,[105] scene interpretation,[106] clinical decision support,[107] knowledge discovery (mining "interesting" and actionable inferences from large databases),[108] and other areas.[109]

Among the most difficult problems in knowledge representation are:

Default reasoning and the qualification problem

Many of the things people know take the form of "working assumptions". For example, if a bird comes up in conversation, people typically picture a fist-sized animal that sings and flies. None of these things are true about all birds. John McCarthy identified this problem in 1969[110] as the qualification problem: for any commonsense rule that AI researchers care to represent, there tend to be a huge number of exceptions. Almost nothing is simply true or false in the way that abstract logic requires. AI research has explored a number of solutions to this problem.[111]
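One classic family of solutions represents rules as defaults that hold unless a more specific exception blocks them. The following is a minimal sketch in Python, not drawn from any particular AI system: the rule names, categories, and individuals are invented purely for illustration.

# Minimal sketch of default reasoning with exceptions.
# "Birds fly" is a working assumption; penguins are an exception.
# All names here are hypothetical, chosen only for illustration.

DEFAULTS = {
    # rule name: (category it applies to, concluded property)
    "birds_fly": ("bird", "can_fly"),
}

EXCEPTIONS = {
    # rule name: categories that block the default
    "birds_fly": {"penguin", "ostrich"},
}

# Each individual is tagged with the categories it belongs to.
INDIVIDUALS = {
    "tweety": {"bird"},
    "pingu": {"bird", "penguin"},
}

def concludes(individual: str, prop: str) -> bool:
    """Apply every default whose category matches, unless a more
    specific exception category also applies to the individual."""
    categories = INDIVIDUALS[individual]
    for rule, (category, concluded) in DEFAULTS.items():
        if concluded != prop or category not in categories:
            continue
        if categories & EXCEPTIONS.get(rule, set()):
            continue  # an exception blocks the default
        return True
    return False

print(concludes("tweety", "can_fly"))  # True: default applies
print(concludes("pingu", "can_fly"))   # False: penguin exception

The sketch also shows why the qualification problem bites: every exception (ostriches, birds with broken wings, and so on) has to be enumerated by hand, and the list never closes.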
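Returning to the ontology machinery described earlier in this section, the description-logic vocabulary of concepts, roles, and individuals can likewise be sketched with ordinary data structures. This is a toy sketch, not the Web Ontology Language itself; the two-level taxonomy and the names below are assumptions made up for this example.

from dataclasses import dataclass, field

# Toy ontology: concepts form a hierarchy, roles relate individuals,
# and individuals are asserted to belong to concepts. The vocabulary
# (Animal, Bird, hasPart, ...) is invented for illustration.

@dataclass
class Ontology:
    parents: dict = field(default_factory=dict)      # concept -> parent concept
    instance_of: dict = field(default_factory=dict)  # individual -> concept
    roles: list = field(default_factory=list)        # (subject, role, object)

    def subsumes(self, general: str, specific: str) -> bool:
        """True if 'specific' is 'general' or a descendant of it."""
        while specific is not None:
            if specific == general:
                return True
            specific = self.parents.get(specific)
        return False

    def is_a(self, individual: str, concept: str) -> bool:
        """Instance check via the concept hierarchy."""
        asserted = self.instance_of.get(individual)
        return asserted is not None and self.subsumes(concept, asserted)

onto = Ontology()
onto.parents.update({"Bird": "Animal", "Animal": "Thing"})
onto.instance_of["tweety"] = "Bird"
onto.roles.append(("tweety", "hasPart", "wing"))

print(onto.is_a("tweety", "Animal"))  # True: Bird is subsumed by Animal

A real OWL reasoner does far more (role hierarchies, restrictions, consistency checking), but the instance check above is the same subsumption idea that upper and domain ontologies rely on.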
Breadth of commonsense knowledge

The number of atomic facts that the average person knows is very large. Research projects that attempt to build a complete knowledge base of commonsense knowledge (e.g., Cyc) require enormous amounts of laborious ontological engineering: they must be built, by hand, one complicated concept at a time.[112]

Subsymbolic form of some commonsense knowledge

Much of what people know is not represented as "facts" or "statements" that they could express verbally. For example, a chess master will avoid a particular chess position because it "feels too exposed",[113] or an art critic can take one look at a statue and realize that it is a fake.[114] These are non-conscious and sub-symbolic intuitions or tendencies in the human brain.[115] Knowledge like this informs, supports and provides a context for symbolic, conscious knowledge. As with the related problem of sub-symbolic reasoning, it is hoped that situated AI, computational intelligence, or statistical AI will provide ways to represent this knowledge.[115]

Planning

Figure: A hierarchical control system is a form of control system in which a set of devices and governing software is arranged in a hierarchy.

Main article: Automated planning and scheduling

Intelligent agents must be able to set goals and achieve them.[116] They need a way to visualize the future (a representation of the state of the world, with predictions about how their actions will change it) and to make choices that maximize the utility (or "value") of the available choices.[117]
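That last sentence compresses the standard decision loop: predict the successor state for each available action, score the predicted states, and act on the best one. A minimal sketch under invented assumptions follows; the one-dimensional world model, the three actions, and the distance-based utility function are all made up for this illustration.

# Minimal sketch of utility-driven action selection: predict the
# successor state for each action, then choose the action whose
# predicted state scores highest. The world model is a toy 1-D
# position with an invented goal; nothing here is library-specific.

GOAL = 10

def predict(state: int, action: str) -> int:
    """Toy world model: how an action changes the state."""
    return state + {"left": -1, "stay": 0, "right": +1}[action]

def utility(state: int) -> float:
    """Invented utility: closer to the goal is better."""
    return -abs(GOAL - state)

def choose_action(state: int, actions=("left", "stay", "right")) -> str:
    # Maximize the utility of the predicted successor state.
    return max(actions, key=lambda a: utility(predict(state, a)))

state = 7
for _ in range(3):
    action = choose_action(state)
    state = predict(state, action)
    print(action, "->", state)  # right -> 8, right -> 9, right -> 10

A hierarchical control system layers this same loop: higher levels choose goals over longer time horizons, while lower levels select concrete actions against those goals on faster timescales.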

Jordan Etem: 1.8K subscribers, 814 posts, 305 total views, 4.6 avg. views per video
This video was published on 2020-11-30 06:18:00 GMT by @Jordan-Etem on YouTube. Jordan Etem has 1.8K subscribers on YouTube and a total of 814 videos. This video received 0 likes and 0 comments, both lower than the channel's averages. @Jordan-Etem averages 4.6 views per video on YouTube, and overall this video's views were lower than the average for the profile.
