Commit 7fd9dbf

More text
1 parent e47507a commit 7fd9dbf

2 files changed: +45 -11 lines changed

README.md

+31 -11
@@ -8,7 +8,9 @@ of a wide range of emotional expressions and facial gestures.
 
 The scripts are written in OpenCog "atomese", with the intent that this
 enables integration with high-level cognitive, emotional and natural
-language-processing functions.
+language-processing functions. The scripts are in active development;
+new designs and design proposals are actively debated on the mailing
+list.
 
 The robot emulator is a Blender animation rig. It implements a dozen
 facial expressions, another dozen gestures, such as blinking and
@@ -42,18 +44,23 @@ At this time, the code here integrates three subsystems:
 visible in the room. These faces are localized in 3D space, and
 issued a numeric ID.
 
+(This needs to be replaced by a (much) better visual system.)
+
 * A collection of "behavior tree" scripts that react to people entering
 and leaving the room. The scripts attempt to interact with the
 people who are visible, by displaying assorted facial expressions.
 
+(This needs to be replaced by a library of selections, as described
+in [README-affects.md](README-affects.md).)
+
 * A representation model of the robot self and its surroundings (namely,
 the human faces visible in the room). The goal of this model is
 two-fold:
 
-** Allow the robot to be self-aware, and engage in natural language
+** Allow the robot to be self-aware, and engage in natural language
 dialog about what it is doing.
 
-** Enable an "action orchestrater" to manage behaviors coming from
+** Enable an "action orchestrater" to manage behaviors coming from
 multiple sources.
 
 Some things it currently doesn't do, but should:
@@ -66,20 +73,33 @@ Some things it currently doesn't do, but should:
 alpha stages.
 
 * Integrate superior face-tracking and face recognition tools.
-Right now, the face tracker eats too much CPU, and is completely
-unable to recognize known faces.
+Right now, the face tracker is completely unable to recognize known
+faces.
 
-* Have a GUI tools for editing behavior trees. The XXX tool has been
-suggested as such a tool.
+* Have a GUI tool for editing behavior trees. This could be
+accomplished by using the
+[behavior3js](http://behavior3js.guineashots.com/) tool.
 
-* Integration with OpenPsi behavior system.
+* Integration with the OpenPsi behavior system. However, see also the
+[affects proposal](README-affects.md), which may be more
+important.
 
-* Enable a memory, via the OpenCog AtomSpace database. The goal here
+* Enable memory, via the OpenCog AtomSpace database. The goal here
 is to remember people and conversations and feelings, between
-power-offs and restarts.
+power-offs and restarts. This requires changes to this repo,
+and also writing tools and utilities to simplify the SQL and/or
+file-dump management.
 
 * Additional sensory systems and sensory inputs. A perception
-synthesizer to coordinate all sensory input.
+synthesizer to coordinate all sensory input. High priority:
+
+  ++ Audio power envelope, fundamental frequency (of voice),
+  rising/falling tone. Background audio power. Length of silent
+  pauses. Detection of applause, laughter, loud background
+  speech, loud bangs.
+
+  ++ Video-chaos: is there a lot of random motion in the visual field,
+  or are things visually settled?
 
 * Have a much more sophisticated model of the world around it,
 including the humans in it. It should also have better model
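
The "Enable memory" item above boils down to wiring this code to the AtomSpace persistence API. The following is a hedged sketch, not code from this commit: it assumes the postgres-backed (opencog persist-sql) guile module; the database name, credentials and the "visible face" predicate are placeholder values, and the exact sql-open argument format differs between OpenCog versions.

; Sketch only: persisting a remembered face across restarts via the
; OpenCog SQL backend.  All names below are placeholders.
(use-modules (opencog) (opencog persist) (opencog persist-sql))

; Open the backing store (argument format varies by OpenCog version).
(sql-open "eva_memory" "opencog_user" "password")

; Something worth remembering: a face that was seen in the room.
(store-atom
    (EvaluationLink
        (PredicateNode "visible face")
        (ListLink (NumberNode "12"))))

; ... and after a restart, pull the remembered faces back in:
(fetch-incoming-set (PredicateNode "visible face"))

(sql-close)

The hard part flagged in the diff is not these calls themselves, but the tooling around database setup and dump management.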
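
Similarly, for the audio items under "Additional sensory systems": the sketch below is illustrative only and is not part of this commit. It shows one way a power envelope and the length of silent pauses could be computed from raw PCM samples; the frame size and silence threshold are made-up numbers, and a real perception synthesizer would more likely compute these upstream (e.g. in a ROS audio node) and publish only the summary values.

; Per-frame power envelope from a list of PCM samples.
(define (frame-power frame)
    (/ (apply + (map (lambda (s) (* s s)) frame))
       (length frame)))

(define (power-envelope samples frame-size)
    (if (< (length samples) frame-size)
        '()
        (cons (frame-power (list-head samples frame-size))
              (power-envelope (list-tail samples frame-size) frame-size))))

; Length, in frames, of the longest run below the silence threshold.
(define (longest-pause envelope threshold)
    (let loop ((env envelope) (run 0) (best 0))
        (cond ((null? env) (max run best))
              ((< (car env) threshold) (loop (cdr env) (+ run 1) best))
              (else (loop (cdr env) 0 (max run best))))))

; e.g. (longest-pause (power-envelope samples 160) 0.01)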

src/behavior.scm

+14
@@ -16,6 +16,20 @@
 ; Pause the main loop:
 ; (behavior-tree-halt)
 ;
+; TODO:
+; -----
+; XXX This needs a major redesign, to NOT use behavior trees at the top
+; level, but instead to provide a library of suitable actions that can
+; be searched over, and then performed when a given situation applies.
+; That is, given a certain state vector (typically, a subset of the
+; current state), the library is searched to see if there is a behavior
+; sequence that can be applied to this situation. If there is no such
+; explicit match, then the fuzzy matcher should be employed to find
+; something that is at least close. If nothing close is found, then
+; either the concept blending code, or a hack of the MOSES knob-turning
+; and genetic cross-over code should be used to create new quasi-random
+; performance sequences from a bag of likely matches.
+;
 ; Unit testing:
 ; -------------
 ; The various predicates below can be manually unit tested by manually
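
The TODO block added above describes an action-selection search rather than a behavior tree: match the current state vector against a library of performances, and fall back to the closest match when nothing applies exactly. The toy sketch below illustrates that loop in plain guile scheme; the library entries, state symbols and the overlap-count similarity are invented for illustration, and the real design would use the AtomSpace fuzzy matcher (and eventually concept blending or MOSES-style variation) instead of the overlap score.

; Hypothetical library of (situation . performance) pairs.
(define behavior-library
    '(((person-entered alone)   . greet-newcomer)
      ((person-left room-empty) . look-bored)
      ((person-entered crowd)   . glance-and-smile)))

; How many state elements two situations have in common.
(define (overlap a b)
    (length (filter (lambda (x) (member x b)) a)))

; Pick the performance whose situation best matches the current state:
; exact match first, else the entry with the largest overlap.
(define (select-performance state)
    (let ((exact (assoc state behavior-library)))
        (if exact
            (cdr exact)
            (cdr (car (sort behavior-library
                (lambda (p q)
                    (> (overlap state (car p))
                       (overlap state (car q))))))))))

; (select-performance '(person-entered crowd))         => glance-and-smile
; (select-performance '(person-entered alone smiling)) => greet-newcomer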
