Category Archives: Brain in a jar

Smart Clothes

ZDNet reports that Infineon Technologies is weaving sensors, processors, and supporting systems into fabrics. They see near-term applications in entertainment, communications, health care and security. I like that they say: “The further evolution of our information society will make everyday electronic applications ever more invisible and natural”.

The Next Generation

Washingtonpost.com has a story about what biotechnology means for being post-human. While the article gets a little dorky at times, and the comic-book references are somewhat over-the-top, it digs well past the surface where most articles would stop. (And come on, admit it: how many of us have daydreamed well into our twenties about doing the kinds of things only comic book heroes can do?) They reference a lot of good material, talk to Ray Kurzweil and Max More, and draw heavily on the excellent Science Magazine issue on this subject.

The Etherface

Etherface, if you’ll let me coin the term, will refer below to the inevitable pervasive network interface that will link physical things with the information that describes and controls them. There will be a tangible world and a corresponding representation of that world, rich with detail and control. The etherface will enable interaction with our world at a much deeper level, a level where the tangible is supplemented. Just as hyperlinks supplement text, tangible objects in our familiar physical world will be linked with information and controls. Other virtual environments will also develop for personal and thematic use.

There will be many paths in the development, including:

  • The transition of computing systems to capture the strengths of centralized, peer-to-peer, and personal computing.

    There is a transition toward centralized and peer-to-peer systems, and away from personal computers, but there is a role for all three, based on the natural strengths of each. Personal computers are well suited for local storage, personal applications, and private servers. Centralized and P2P computing can better take advantage of commercial and shared applications and storage.

  • The evolution of personal digital assistants (PDAs) into wearable, even sub-dermal, sensory, productivity, and interactivity tools.

    Devices that augment our existing senses offer a wide variety of ways to enhance the human condition, and the ability to compute while mobile frees us from many traditional location-based confines.

  • The extension of networks into a broader scope of devices.

    As 802.11, Bluetooth, RFID, and other wireless networking systems are supported in a broader array of appliances, the usefulness of the network increases. The definition of an interface standard for the broad introduction of compatible devices will be an important milestone for wider adoption.

  • The development of intuitive interfaces, taking advantage of more of our senses and biology.

    Interfaces will come to replicate the natural interface through which we experience reality, namely our senses and biology. Visual and auditory interfaces are further advanced than others, but still have a long way to go. Voice interaction, holographic video, biofeedback, and dynamically self-enhancing affective computing techniques are some stepping-stones toward truly intuitive human-computer interaction.

  • The standardization of information management and distribution platforms to support all of the above.

    .NET and SOAP are strong steps in the right direction with regard to interoperability and integration standards for information applications. Further, systems like CPAN make available a very large and constantly growing code base that is public and documented. XML goes beyond HTML by allowing unlimited, user-defined tags, and new standards are developing for the sharing of data. XSLT transforms allow for the separation of content from presentation with almost unlimited extensibility (a small sketch of that separation follows this list). These trends will continue, representing an evolving standard for the management and distribution of information.

  • The development of machine understanding standards and tools.

    Objects in the physical world share characteristics with other objects in the same class. This kind of inherited description lets us understand deeply and draw analogies based on commonalities; it may even be the case that creativity is based in part on the exploration of indirect analogies. Similarly, inheritance and object-oriented design are growing in importance as heuristics for information management and distribution. Inheritance plays a key role in the functionality of a code base, and likewise allows for deeper machine understanding of virtual objects (sketched just below).
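
To make that inheritance point concrete, here is a minimal sketch; the class names and descriptions are illustrative assumptions, not a proposed ontology:

```python
# Sketch: inherited descriptions let software reason about an object it has never
# been told about directly, by falling back on what it knows of the parent class.
class PhysicalObject:
    def describe(self):
        return ["occupies space", "has mass"]

class Container(PhysicalObject):
    def describe(self):
        return super().describe() + ["can hold other objects"]

class CoffeeMug(Container):
    def describe(self):
        return super().describe() + ["holds roughly 300 ml", "has a graspable handle"]

# A program that knows only the parent classes still "understands" most of what
# a CoffeeMug is, because the description is inherited rather than restated.
print(CoffeeMug().describe())
```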
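
And here is the content/presentation separation mentioned under the standardization point above, as a minimal sketch. It assumes the third-party lxml library, and both the document and the stylesheet are made up for illustration:

```python
# Sketch: the same XML content can be rendered many ways by swapping the XSLT
# stylesheet, keeping content and presentation separate.
from lxml import etree

content = etree.XML("""
<articles>
  <article><title>The Etherface</title><category>Brain in a jar</category></article>
</articles>
""")

stylesheet = etree.XML("""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <html><body>
      <xsl:for-each select="articles/article">
        <h2><xsl:value-of select="title"/></h2>
      </xsl:for-each>
    </body></html>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(stylesheet)   # compile the presentation layer
print(str(transform(content)))       # render the unchanged content as HTML
```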

The etherface is key to enlightenment and the extension of the human life span. Eventually, the etherface will allow individuals to live with whatever level of relationship to the tangible world they choose. This means one could use the etherface to set a timer for coffee in the morning, or to overwhelm the physical world with a virtual environment where one controls the rules and the access. Enhancements to our sensory, memory, communication, and processing systems increase our opportunities for experience, reflection, sharing, and learning.

Mental health will grow in importance relative to physical health over time as better tools reduce demands on our physiques. The etherface is one of these tools, enabling productivity and interaction while reducing the risk that failing physical health will end life. Taken further over time, we could extend life spans and quality of life to a degree that we do not yet understand or have any experience with.

Method of server-side biofeedback system for mechanically evolving human-computer interfaces

Many different forms of biofeedback devices, each with its own software, could be attached to a USB port on any networked device that supports USB. The output from the devices is sent in real time, in raw (compressed and encrypted) form, to a server-side application, probably written in C++. Installing a new device involves:

  1. acquiring the biofeedback device,
  2. connecting it to the network device,
  3. identifying the muscle groups, electrode locations, or any other biofeedback characteristics,
  4. initiating the translational learning, and
  5. interacting in regular teaching sessions within an individual account.

The first signal from the biofeedback device to the server is its identification. An installation file is selected on the basis of this identification and run on the server side to set up the new device and initiate the translation learning. Translation learning begins as the translation software (referred to below as “Empath”) requests that the user perform certain tasks, concurrently storing the streaming output of the biofeedback device. A new database table is generated for each device installed to your Empath within your account. Correlations are evaluated between the biofeedback streams and the tasks being performed in order to identify the appropriate computer response to real-time biofeedback streams.

Each user teaches the application to respond to biofeedback by providing the computer with adequate samples of data. When a translation does not meet an acceptable standard of certainty, the user is prompted for additional samples that specify the correct translation; this acts to teach the Empath. The teaching system can be disabled, and the certainty standards can be edited.
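
As a rough sketch of how this translation learning and certainty check might look: the task names, feature vectors, nearest-signature matching, and threshold below are all illustrative assumptions, not a specification of Empath.

```python
import numpy as np

# Sketch of Empath's translation learning: each task requested during a teaching
# session is paired with the biofeedback samples recorded while the user performed
# it, and a per-task "signature" (here, simply the mean feature vector) is stored.
def learn_translations(samples):
    """samples: dict mapping task name -> list of feature vectors recorded for that task."""
    return {task: np.mean(np.asarray(vectors), axis=0) for task, vectors in samples.items()}

def translate(signatures, live_features):
    """Map a real-time feature vector to the closest learned task, with a crude certainty score."""
    distances = {task: float(np.linalg.norm(live_features - sig)) for task, sig in signatures.items()}
    best = min(distances, key=distances.get)
    certainty = 1.0 / (1.0 + distances[best])      # illustrative, not a calibrated probability
    return best, certainty

signatures = learn_translations({
    "clench_left_fist": [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "raise_eyebrows":   [[0.1, 0.9, 0.7], [0.2, 0.8, 0.8]],
})

CERTAINTY_THRESHOLD = 0.5                          # editable, as the post describes
task, certainty = translate(signatures, np.array([0.85, 0.15, 0.15]))
if certainty < CERTAINTY_THRESHOLD:
    print(f"Uncertain about '{task}': please repeat the gesture so Empath can learn from it.")
else:
    print(f"Translated biofeedback as '{task}' (certainty {certainty:.2f})")
```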

A device manufacturer would provide a biofeedback hardware device that streams real-time output to a USB plug, along with a very small definition file (probably an XML file consisting of identification and translation variables, possibly specifying the tasks to be requested in initial teaching sessions, as well as other fields that could grow into a public standard interface protocol). From the user account, a list of supported devices and their available definition files could be displayed.
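
A definition file along those lines might look something like the following; the element names, device id, and parsing code are hypothetical sketches, not a published format.

```python
import xml.etree.ElementTree as ET

# Sketch of the small definition file a manufacturer might ship, and how the server
# could read it when the device first identifies itself.
DEFINITION = """
<biofeedback-device id="acme-emg-0001">
  <name>ACME Forearm EMG Band</name>
  <channels>8</channels>
  <sample-rate unit="Hz">500</sample-rate>
  <teaching-tasks>
    <task>clench fist</task>
    <task>relax forearm</task>
  </teaching-tasks>
</biofeedback-device>
"""

root = ET.fromstring(DEFINITION)
device_id = root.get("id")
tasks = [t.text for t in root.findall("./teaching-tasks/task")]
print(device_id, tasks)   # the server selects installation steps from the id,
                          # and requests these tasks in the initial teaching session
```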

This method would provide an interface between the human and the computer in which the two communicate through any type of USB biofeedback device. These devices would connect by USB cable to a network device, which is in turn communicating with a server (probably via thin-client streaming software that takes the USB input and sets up VPN, compression, and encryption).
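
The thin streaming client might reduce to something like this sketch; the server address, the length-prefixed framing, and the use of TLS in place of a full VPN are assumptions for illustration.

```python
import socket, ssl, zlib

# Sketch of the thin streaming client: raw device output is compressed and sent
# over an encrypted channel to the server-side Empath application.
SERVER = ("empath.example.net", 9444)        # hypothetical Empath server

def open_encrypted_channel():
    context = ssl.create_default_context()   # TLS stands in for the VPN layer here
    raw = socket.create_connection(SERVER)
    return context.wrap_socket(raw, server_hostname=SERVER[0])

def send_chunk(channel, raw_bytes):
    payload = zlib.compress(raw_bytes)                          # compress the raw stream
    channel.sendall(len(payload).to_bytes(4, "big") + payload)  # simple length-prefixed frame

# Usage (requires a listening server):
#   channel = open_encrypted_channel()
#   send_chunk(channel, usb_device.read(1024))
```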

As many users teach their Empath to translate their biofeedback, aggregate information will be extracted to increase the rate of learning for each user. Correlations across very large populations will help to initiate the teaching process for new individuals; this way the Empath can use the rest of the population as a starting place from which to learn the new, individually optimized translation.
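
A minimal sketch of that population warm-start, with the same illustrative signature format and simple averaging assumed above:

```python
import numpy as np

# Sketch: a new user's starting signatures are the average of existing users'
# learned signatures for the same device and task, then refined by the new
# user's own teaching sessions.
def population_prior(all_user_signatures):
    """all_user_signatures: list of dicts mapping task -> learned signature vector."""
    tasks = set().union(*(sigs.keys() for sigs in all_user_signatures))
    return {
        task: np.mean([sigs[task] for sigs in all_user_signatures if task in sigs], axis=0)
        for task in tasks
    }

prior = population_prior([
    {"clench_left_fist": np.array([0.9, 0.1]), "raise_eyebrows": np.array([0.1, 0.9])},
    {"clench_left_fist": np.array([0.7, 0.3])},
])
print(prior)
```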

Implications for computing: computers will learn to understand your gestures if you can stream them into a USB port. Server-side “ASP” processing allows for very thin client applications and extensible device support. Publication of the interface standard enables mass-market public and commercial development of biofeedback devices with USB output.

Implications for humans: we turn the corner such that computers learn to understand what humans mean, rather than humans being forced to learn new (and highly limited) communication skills, like typing.

The Changing Face of Evolution

Genetic codes in our cells provide the system upon which information is stored and algorithms are performed to determine our perceptions. Similarly, software codes provide the system upon which information is stored and algorithms are performed, effectively doing the same thing. Major industries will focus on the processes of evolving these systems (and the interfaces that enable communication between them). It is purely our life, and our perception of it, that defines our demands, and so these two areas of business will form the dominant industries of the next century.

Computer interfaces will become natural extensions of our senses, integrated with device controls that allow us to interact with our environment and each other, using and broadcasting information. This will increase the effectiveness, efficiency, and diversity of communication, as well as give us control over network devices, systems, and resources.