
Thread: MIT AlterEgo wearable listens to internal verbalisations

  1. #17
    CAT-THE-FIFTH

    Re: MIT AlterEgo wearable listens to internal verbalisations



  2. #18
    Senior Member

    Re: MIT AlterEgo wearable listens to internal verbalisations

    I swear I can't distinguish sci-fi from reality any more; there's a real overlap now. As I understand it, the wearable can only pick up internal 'in the head' verbalisations, i.e. subvocalisations, not thoughts in general. There are already EEG caps that can read broad mental activity, and the Neuralink-style implant idea has apparently already been tried on mice. Here's a vaguely informative clip.


  3. #19
    Registered User

    Re: MIT AlterEgo wearable listens to internal verbalisations

    From the MIT paper's abstract: We present a wearable interface that allows a user to silently converse with a computing device without any voice or any discernible movements, thereby enabling the user to communicate with devices, AI assistants, applications or other people in a silent, concealed and seamless manner. A user's intention to speak and internal speech are characterized by neuromuscular signals in internal speech articulators, which are captured by the AlterEgo system to reconstruct this speech. We use this to facilitate a natural-language user interface, where users can silently communicate in natural language and receive aural output (e.g. through bone-conduction headphones), thereby enabling a discreet, bi-directional interface with a computing device and providing a seamless form of intelligence augmentation. The paper describes the architecture, design, implementation and operation of the entire system. We demonstrate the robustness of the system through user studies and report a median word accuracy of 92%.
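
    For anyone curious what the recognition side of this might look like, here is a minimal sketch in Python. To be clear, this is not the actual AlterEgo code, and the details are my own assumptions: the sampling rate, electrode count, FFT band-pass, hand-picked features and nearest-centroid classifier are all placeholders (the real system, as I understand it, trains a neural network on recorded articulator signals). It only illustrates the pipeline shape the abstract implies: filter the raw electrode signals, window them, extract features, and map each window to a word from a small vocabulary.

    import numpy as np

    FS = 250           # assumed sampling rate (Hz), not taken from the paper
    N_CHANNELS = 7     # assumed electrode count; the device reads a handful of face/neck sites
    WINDOW = FS        # one-second analysis windows, an arbitrary choice

    def bandpass(x, lo=1.3, hi=50.0, fs=FS):
        # Crude FFT-based band-pass; a real system would use a proper IIR filter.
        spec = np.fft.rfft(x, axis=-1)
        freqs = np.fft.rfftfreq(x.shape[-1], d=1.0 / fs)
        spec[..., (freqs < lo) | (freqs > hi)] = 0
        return np.fft.irfft(spec, n=x.shape[-1], axis=-1)

    def features(window):
        # Per-channel RMS and mean absolute first difference: toy stand-ins
        # for whatever representation the real system learns.
        rms = np.sqrt((window ** 2).mean(axis=-1))
        mad = np.abs(np.diff(window, axis=-1)).mean(axis=-1)
        return np.concatenate([rms, mad])

    class NearestCentroid:
        # Tiny classifier: one mean feature vector per word, predict the closest.
        # A stand-in for the neural network the real system trains.
        def fit(self, X, y):
            self.labels = sorted(set(y))
            self.centroids = np.array(
                [X[np.array(y) == lab].mean(axis=0) for lab in self.labels])
            return self

        def predict(self, x):
            dists = np.linalg.norm(self.centroids - x, axis=1)
            return self.labels[int(np.argmin(dists))]

    # Synthetic stand-in data: noise whose amplitude depends on the "word",
    # since I obviously don't have real articulator recordings.
    rng = np.random.default_rng(0)
    vocab = ["yes", "no", "call", "time"]
    X, y = [], []
    for word in vocab:
        for _ in range(20):
            raw = rng.normal(scale=1.0 + vocab.index(word), size=(N_CHANNELS, WINDOW))
            X.append(features(bandpass(raw)))
            y.append(word)
    clf = NearestCentroid().fit(np.array(X), y)

    # "Recognise" one unseen window; with this toy data it should come out "call".
    test = rng.normal(scale=3.0, size=(N_CHANNELS, WINDOW))
    print(clf.predict(features(bandpass(test))))

    The hard parts in the real device are obviously the electrode placement and the learned model; the skeleton above is just to make the abstract's "capture signals, reconstruct speech" step concrete.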

