Pilot Logistics Logic Engine (PLLE)
What is Pilot?
Pilot is the AGI software implementation that forms a central component of the DSS products Pilot and Pilot Space.
Pilot is a generic automation and reasoning agent that "takes control over" and "receives input from" any type of Scene. We call this embodiment, and it applies to all kinds of devices where a closed loop of interaction can be created between the Pilot control software and the environment to which it connects.
Pilot is designed to drive automation of desktop processes, animate virtual synthetic person avatars in VR/AR, and take control over any kind of machinery or industrial or data process that is currently operated under human control.
Pilot operates only in combination with a human instructor, or in combination with an integrated SNN of type PSNM that takes over and extends the role of the human operator.
What is logistics?
Logistics will make most people think of the transport sector. Ships and trains and trucks moving raw material and components and finished products between factories, buyers, sellers, producers, consumers and warehouses.
Logistics is all of that and more. It is not only about transporting items but also about making sure that required amounts/quantities of items remain present at pre-defined places.
To do this, logistics needs processes that:
- create items
- consume items
- convert items (consume items to create other items)
- store items (e.g. storage for cooling fruit)
- transport items (where the transport itself is also a storage container)
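The five process kinds above can be sketched uniformly as consume/produce mappings. This is a minimal illustration, not Pilot's actual data model; all class and field names are assumptions. Note how storage and transport fall out naturally as processes whose inputs equal their outputs.

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    """Illustrative sketch: a logistics process consumes and/or produces
    quantities of named items."""
    name: str
    consumes: dict = field(default_factory=dict)  # item -> quantity
    produces: dict = field(default_factory=dict)  # item -> quantity

harvest = Process("harvest", produces={"apple": 100})          # create
eat     = Process("eat", consumes={"apple": 1})                # consume
press   = Process("press", consumes={"apple": 10},
                  produces={"juice_litre": 4})                 # convert
cool    = Process("cool store", consumes={"apple": 100},
                  produces={"apple": 100})                     # store
ship    = Process("truck", consumes={"apple": 100},
                  produces={"apple": 100})                     # transport
```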
Pilot internally uses a logistics engine to make sense of the world. That means that commands (both programmed and vocally issued) are translated into constraints that see the world as a logistics process. So Pilot sees the world as a collection of objects (what item) and events (what happens to items) that have a known time and place. Time here is also episodic: the system does not only track the current time and place of items, but also keeps track of where these were in the past and where they will (need to) be in the future.
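Episodic tracking of an item's place over past, present and future could look like the following sketch (a sorted timeline queried by time; the class and its methods are hypothetical, not Pilot's implementation):

```python
import bisect

class ItemHistory:
    """Track where an item was, is, and is planned to be (episodic time)."""

    def __init__(self):
        self.times = []   # sorted timestamps
        self.places = []  # place at the matching timestamp

    def record(self, time, place):
        # Insert the episode while keeping the timeline sorted.
        i = bisect.bisect_right(self.times, time)
        self.times.insert(i, time)
        self.places.insert(i, place)

    def place_at(self, time):
        # Last known/planned place at or before `time`, or None.
        i = bisect.bisect_right(self.times, time)
        return self.places[i - 1] if i else None

apple = ItemHistory()
apple.record(0, "orchard")     # past
apple.record(5, "warehouse")   # present
apple.record(9, "shop")        # planned future
```

Querying `apple.place_at(6)` yields `"warehouse"`, while a future timestamp yields the planned place, matching the "where they will (need to) be" idea.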
"Logistics Logic" extends and uses following ground principles:
1. Object Oriented
- Items have attributes (data fields, e.g. apple.color = green).
- Items are of a known type (e.g. apple = type fruit, apple = type growsOnTree, fruit = type food).
2. Heuristics
- shortcuts (how to get from situation 1 to situation 2) are remembered/learned
- each heuristic has a known start (input) and end (output) situation
- later solutions first try to solve (part of) the needed change in situation by applying one or more heuristics
- the missing parts are then filled in by finding the optimal path through enumeration of possible steps that bridge the given situations
3. Enumeration
- make a list of all possible moves to bridge the situations and select the best of the found routes
4. Resource management
- calculates the cost of each possible step
- costs cover resource consumption, but also the time and money needed (resources often translate to expenses, so currency is also a resource)
- the availability of resources determines which action is possible at which moment, and what needs to be done before and after to create a situation in which the desired actions are possible.
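The enumeration and cost-selection principles above amount to a costed search over situations. As a minimal sketch (uniform-cost search over a toy situation graph; the function and the move table are illustrative assumptions, not Pilot internals):

```python
import heapq

def cheapest_route(start, goal, moves):
    """Enumerate step sequences from `start` to `goal` and return
    (total_cost, steps) for the cheapest one, or None if unreachable.
    `moves` maps a situation to (next_situation, step_name, cost) options."""
    frontier = [(0, start, [])]
    seen = set()
    while frontier:
        cost, state, path = heapq.heappop(frontier)
        if state == goal:
            return cost, path
        if state in seen:
            continue
        seen.add(state)
        for nxt, step, c in moves.get(state, []):
            heapq.heappush(frontier, (cost + c, nxt, path + [step]))
    return None

# Toy situation graph: get apples from the orchard to the shop.
moves = {
    "orchard":   [("warehouse", "truck", 3), ("shop", "van", 7)],
    "warehouse": [("shop", "truck", 2)],
}
```

Here the two-hop truck route (cost 5) beats the direct van (cost 7); a learned heuristic would simply pre-seed part of such a route so that only the gaps need enumerating.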
What is logic?
Among programmers, logic is known:
- as Boolean logic: operations on questions that have only YES/NO answers
- as fuzzy logic: operations on questions whose answers deal with probability, expressed as a numeric value between -1 and +1 (-100% sure, through 0% sure, to +100% sure)
PLLE extends on this by:
- Allowing Boolean operations on lists.
- Implementing temporal logic by adding the extra Boolean operators SEQ, ON and OFF.
- Implementing logistics into logic, bringing awareness of time and place within the realm of Boolean operators and conditions.
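The exact semantics of SEQ, ON and OFF are not defined here, but one plausible reading can be sketched: SEQ tests whether events occur in a given order, and ON/OFF detect the rising and falling edges of a condition over time. These interpretations are assumptions, not PLLE's specification.

```python
def SEQ(events, pattern):
    """True when the items of `pattern` occur in `events` in order
    (not necessarily adjacent) - one plausible reading of SEQ."""
    it = iter(events)
    # `p in it` consumes the iterator up to the match, enforcing order.
    return all(p in it for p in pattern)

def ON(signal):
    """Indices where a Boolean signal rises False -> True (assumed meaning)."""
    return [i for i in range(1, len(signal)) if signal[i] and not signal[i - 1]]

def OFF(signal):
    """Indices where a Boolean signal falls True -> False (assumed meaning)."""
    return [i for i in range(1, len(signal)) if not signal[i] and signal[i - 1]]
```

Under this reading, `SEQ(["load", "drive", "unload"], ["load", "unload"])` holds, while the same pattern against `["unload", "load"]` does not.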
Why is it an engine?
An engine is a device that powers/drives/animates a given construct.
- Mechanical engines empower and make mechanical constructs move.
- Software engines empower and make software constructs move.
The PLLE is called an engine in that it is the central software component that brings "actions and animation" and "motivation to do something" to the Pilot program.
This engine implements (among others) the following main tasks and purposes:
1. NLP conversion of native human speech to the internal logistics representation, and vice versa, conversion of the internal representation into human speech.
2. Communicate (by using IGOR) machine <-> human to practically resolve ambiguities and acquire missing insights.
3. Combine logic with logistics. Logistics is the internal format into which all information of the outside world is translated.
4. Enumerate and compute alternative series of steps that reach the same end situation from the same start situation and decide which of these alternatives can be started (concurrently).
5. Operate within known resource limitations and optimize resource usage via the resource manager that is part of the logistics module.
What is it used for?
Logistics Logic is an attempt (I use that word until it is deployed) to capture our world in as little complexity as possible. This is in no way sufficient for a system that can properly reason about things, but it is sufficient for a system that can rule out impossible things. Everything is understood in this internal format, which thinks of all things as having a time and a place and as being input and output of processes.
For LL (Logistics Logic), transport is merely the gradual change of place/location over time/period. So transport is a function that knows a route, which can be used to calculate the current position (and speed) of any item that is being transported. Usually it is actually the container object that is being transported, and all the items in it derive their position from their relative position inside that transport. This is also linked to the hinge-tree class, where all items have a location that is relative to their parent item, which can be the road for a car, an elevator for a person, or simply a couch for a person watching TV.
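Transport as gradual change of place over time, and parent-relative positions, can both be sketched in a few lines (a simplified 2D illustration; the function names and linear-interpolation model are assumptions, not the hinge-tree implementation):

```python
def position_on_route(route, total_time, elapsed):
    """Linear interpolation of position along a route of (x, y) waypoints:
    transport as gradual change of place over time."""
    if elapsed <= 0:
        return route[0]
    if elapsed >= total_time:
        return route[-1]
    legs = len(route) - 1
    t = elapsed / total_time * legs  # progress measured in legs
    i = int(t)
    f = t - i                        # fraction of the current leg
    (x0, y0), (x1, y1) = route[i], route[i + 1]
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

def absolute_position(parent_pos, offset):
    """Items inside a transport (or on a couch) keep a position relative
    to their parent; the absolute position is derived from the parent."""
    return (parent_pos[0] + offset[0], parent_pos[1] + offset[1])
```

An item in a moving container would then get its position as `absolute_position(position_on_route(...), offset_in_container)`, mirroring the parent-relative locations of the hinge-tree idea.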
Pilot is an AGI system that runs the synthetic support system of AIR1 (pronounced R-one). So Pilot and the PLLE are like a very smart PLC (think of the electronics for automating factory processes) that:
1) controls the body of the robot
2) automates any data processing stream and process
The purpose is that we humans only have this automation for motor (body/limb movement) processes, while AIR1 does not only automate (program) its body but also its computer-native capabilities. Because after all . . . it is still a computer, so it would be crazy NOT to implement all the things that a computer can already do today, giving the synthetic mind access to computer abilities at the speed and in the data format in which these normally operate.
In neurobiological terms one could call the PSNM the neocortex (conscious reasoning) and the PLLE the cerebellum (automation), allowing, as in humans, tasks to be performed without having to focus on (rethink) every detail of them. This software construct is modelled after its bioware equivalent, although many operations are executed without neurons due to current-generation computer hardware constraints.
The fused being that results from this is able to consciously push a problem to the PLLE, which can in a few milliseconds enumerate the millions of possible solution methods and return to the conscious part those that look most promising given the available resources and situation. Our conscious human brain/mind, without PLLE assistance, will evaluate at most a handful of possible scenarios and then settle on the most promising one.
Integration
PLLE is one of the modules that make up the Pilot and Pilot Space software programs. One could almost say that PLLE is the actual Pilot program, as it is the main central module that directs much of the information flow inside the program. PLLE is used to make sense of the real world and is central to how Pilot implements NLP (natural language processing).
Here are some of the major modules that PLLE works together with inside Pilot.
1) SVAD Synthetic Vision And Drawing
- recognition of visual patterns
- creation of unique qualia per unique encountered visual (sub)pattern
- decode and encode, visual shape to/from data structure
2) SHAS Synthetic Hearing And Speaking
- recognition of audio patterns
- creation of unique qualia per unique encountered audio (sub)pattern
- decode and encode, audio sound to/from data structure
3) IGOR Interactive Graph Object Render
- display tree structure of hierarchy of visual and audio qualia
- display label (name) of each qualia
- display/play visual/audio pattern that belongs to this qualia
4) Console, user interactive dialog
- text and speech input and output of user commands and system responses
- use text (qualia names) in this conversation
- use rendering to display qualia for which no name is known yet
- use scanning to learn qualia for which the name was known but no data pattern was stored
- convert English user instructions into internal AA format (Actor-Action format) by listening and reading
- convert AA format response data-packets into English by speaking and printing
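The internal AA (Actor-Action) format is not documented here, so the following is purely a hypothetical rendering of what an AA data packet might carry, based only on the actor/action framing above:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AAPacket:
    """Hypothetical Actor-Action packet; field names are assumptions,
    not the real AA format."""
    actor: str
    action: str
    obj: Optional[str] = None
    time: Optional[str] = None
    place: Optional[str] = None

# "The truck delivers apples to the warehouse at 09:00" might parse to:
packet = AAPacket(actor="truck", action="deliver", obj="apples",
                  time="09:00", place="warehouse")
```

Converting back to English would walk such a packet and render each field into a sentence, the reverse of the listening/reading direction.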
5) PLLE Pilot Logistics Logic Engine
- used to parse incoming stories (reading text, hearing story, seeing video)
- convert stories into scenarios (identify actors and their actions)
- convert (part of) a scenario into a script (take the role of one of the actors and perform that scenario)
- used to schedule scripts
- used to monitor demanded stock quantities (that trigger actions)
- used to schedule processes (that consume/use/produce items)
- used to schedule transport of items
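The stock-monitoring task above is essentially demand-driven triggering: when a monitored quantity drops below its required minimum, a replenishing process or transport is scheduled. A minimal sketch, with function names, batch size and data shapes all assumed for illustration:

```python
def restock_actions(stock, minimums, batch=100):
    """Return the replenishment actions triggered by stock levels that
    have fallen below their demanded minimum quantities."""
    actions = []
    for item, minimum in minimums.items():
        if stock.get(item, 0) < minimum:
            # In a full system this would schedule a producing process
            # or a transport; here we just emit an action tuple.
            actions.append(("replenish", item, batch))
    return actions

stock = {"apple": 20, "juice_litre": 300}
minimums = {"apple": 50, "juice_litre": 100}
```

With these numbers only the apples fall below their minimum, so a single replenish action is produced; juice is left alone.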