Smartphones But in Thin Air? | Future of Interface Evolution – Medium

Posted: December 8, 2020 at 9:52 pm


What has pandemic life been like for you? I'm curious to know how the experience varies for different people, so I began researching that this past weekend.

And somewhere down that internet rabbit hole I began looking into past pandemics and what pandemic life was like for the humans back then.

Back in 1918, during the Spanish Flu, not only were there no effective vaccines or antivirals (drugs that can treat the flu today), but the ability to converse with friends and family all over the world was far from possible.

In fact, with the lack of communication tools, people relied on local community updates through physical printing methods to stay in the loop about the pandemic's progress. And if you caught the virus, there was no way to let anyone know, let alone ask for help.

If you were smart at the time, you might take something like a white scarf and wrap it around your door handle to let people know that you weren't feeling well and they shouldn't enter your room.

With no way to connect and communicate, those roughly 24 months were made all the more dreadful by isolation and fear of the unknown.

Whereas today, during the coronavirus pandemic, the advancement of technology has made the transition more comfortable for many people.

With an estimated 5+ billion people having access to a cell phone or computer, we're far better off in terms of connection than those living through the Spanish Flu.

Our devices have become an external limb: we carry them everywhere, hold them throughout the day, and if we leave the house with nothing else, our phone will at least be in our pocket.

If we think about this progression, it started with computer interfaces. A huge innovation for computers was the addition of the mouse and keyboard, which made using them more intuitive.

We then got rid of the keyboard and mouse and began controlling these devices the way we do the rest of the objects around us: with our hands.

Simple touches replaced the need for keyboards and mice, leading us to the touchscreen phase. It's so easy and natural that a child can control these devices with no instructions, using just their fingers.

But what now?

It seems like an obvious progression for these hardware devices to now disappear, and for the information we access through them to be available to us in thin air.

Currently, virtual, mixed, and augmented reality are being leveraged to try to make this a reality, with initiatives like Project Aria by Facebook. However, using XR in isolation doesn't seem like a promising bet for making this ubiquitous.

Let's compare this idea of accessing all information from a device in thin air to how smartphones became ubiquitous.

According to the founder of Neurable, there were 3 main stages to the iPhone becoming ubiquitous.

Category 1: Niche/Enterprise specialized category

The Palm Pilot falls under this: a personal digital assistant built specifically for business people to organize their data.

Category 2: Consumer Specialized Category

The Palm LifeDrive falls under this: a handheld with WiFi, a touchscreen, and all these novel features that the industry hadn't seen yet.

Category 3: Ubiquitous

The iPhone falls under the ubiquitous category. The interesting thing is that the Palm LifeDrive, which came out years earlier than the iPhone, had more features.

Secret: the difference between a consumer specialized product and a ubiquitous product is an interaction that feels undeniably natural, which is what the iPhone delivered.

Current AR/VR methods are an unnatural alternative to the smartphone because they lack that undeniably natural means of interaction.

*Enter: Brain computer interface magic*

Brain-computer interfaces (BCIs) are systems that allow communication between the brain and various machines, and they seem to be the next stage in this interface evolution.

An event-related potential (ERP) is the measured brain response that is the direct result of a specific sensory, cognitive, or motor event. It has been used to allow differently-abled individuals to type using their thoughts, hands-free.

The way that this works is that:

1. If the patient blinked, the highlighted key would move to the right.

2. If the patient didn't blink for three seconds, that key would be clicked and typed.

3. There were also shortcut keys that let you go back to a specific row, or shuffle between rows, without having to click through each box individually.
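As a toy illustration (not the actual clinical system), the blink-driven scanning logic described above can be sketched in a few lines of Python. The key layout, the fixed time step, and the three-tick dwell threshold are all illustrative assumptions:

```python
# A minimal simulation of a blink-driven scanning keyboard.
# Each tick is one fixed time step (e.g. one second); a blink advances the
# highlight, and a run of blink-free ticks selects the highlighted key.

KEYS = ["A", "B", "C", "D", "E"]
DWELL_TICKS = 3  # ticks without a blink needed to select a key

def run_scanner(blink_events):
    """blink_events[i] is True if the user blinked during tick i."""
    cursor, idle, typed = 0, 0, []
    for blinked in blink_events:
        if blinked:
            cursor = (cursor + 1) % len(KEYS)  # blink moves the highlight right
            idle = 0
        else:
            idle += 1
            if idle == DWELL_TICKS:            # held still long enough: type it
                typed.append(KEYS[cursor])
                idle = 0
    return "".join(typed)

# Blink twice to reach "C", then hold still for three ticks to type it.
print(run_scanner([True, True, False, False, False]))  # prints "C"
```

Real systems pair this kind of scanning scheme with the shortcut keys mentioned above, so users can jump between rows instead of stepping through every key.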

As this technology progresses and is made faster and more intuitive, it could allow all of us to type using just our thoughts.

Demos have also been created that leverage such ERPs to detect a variety of conscious wants, then use advancing ML to produce real-time, responsive actions in an XR environment: for example, thinking of wanting an orange that sits on a table across the room, and watching the orange travel to you.

This intersection of XR and BCIs could allow us to create an alternative to smartphones that becomes ubiquitous.

Three main steps:

The first step is measuring brain signals, which can be done with three different approaches.

So, imagine you are living in a different city and want to join your family for their dinner table conversation.

1. Invasive method: electrodes are surgically implanted directly into the brain tissue, recording activity from individual neurons.

This invasive approach would be like asking all your family members to wear lavalier microphones on their collars while on a call with you, and you listen to the conversation using AirPods, so you get clear and crisp information (audio) of what they are saying.

2. Semi-invasive method: electrodes are placed on the exposed surface of the brain and electrocorticography (or ECoG) data is collected by recording electrical activity from the cerebral cortex of the brain.

This would be like having a smartphone on the table to listen to your family's conversation via WhatsApp audio: you can hear what they are saying, but it could be crisper.

3. Non-invasive method: sensors are placed on the scalp to measure the electrical potentials produced by the brain also known as electroencephalogram (or EEG) data.

This is like having your phone in the kitchen and listening to the call while you clean up the living room: it's harder to make sense of what they are saying, but you could listen carefully to understand it better.

Drawing the parallel between the brain-signal collection methods and the ways of listening to a call shows the accuracy and detail of the data each can collect.

The measured brain signals are then run through software that identifies different signal patterns based on the activity being performed.

For example, if a theta wave is detected, a brainwave with a frequency between 4 and 7 hertz, it indicates that the individual is drowsy or in light sleep.

Then machine learning is used to activate an output: a machine takes a certain action. The external device is controlled, or responds, according to how it was programmed to behave for the brain signal detected.
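As a toy illustration of the signal-interpretation step, the sketch below estimates the power in the theta band (4–7 Hz) of a synthetic EEG trace and compares it against the alpha band (8–12 Hz). The band boundaries, sampling rate, and the "theta dominates means drowsy" rule are simplified assumptions; real classifiers are far more sophisticated:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate signal power in a frequency band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def classify_state(signal, fs):
    """Label a trace 'drowsy' when theta (4-7 Hz) power exceeds alpha (8-12 Hz)."""
    theta = band_power(signal, fs, 4, 7)
    alpha = band_power(signal, fs, 8, 12)
    return "drowsy" if theta > alpha else "alert"

fs = 250                                  # sampling rate in Hz
t = np.arange(0, 2, 1.0 / fs)             # two seconds of samples
theta_trace = np.sin(2 * np.pi * 6 * t)   # dominant 6 Hz component
alpha_trace = np.sin(2 * np.pi * 10 * t)  # dominant 10 Hz component

print(classify_state(theta_trace, fs))  # prints "drowsy"
print(classify_state(alpha_trace, fs))  # prints "alert"
```

The output label would then drive the third step: triggering whatever action the external device was programmed to take for that brain state.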

Currently, the most practical applications of brain-computer interfaces have been in the medical field.

According to the World Bank, 1 billion people, or 15% of the world's population, are differently-abled, and many must rely on others to help them perform basic tasks like eating, walking, drinking water, and bathing.

They lack the privilege of controlling their day to day actions and interacting with other people & technology the way fully-abled individuals can.

The previous example of typing using your thoughts is currently being used by patients with locked-in syndrome (LIS), who cannot move any muscles in their body except to blink their eyes.

Using BCIs, researchers from Case Western Reserve University and Harvard Medical School have also been able to restore functional reach-to-grasp ability for a patient who had a severe spinal cord injury and was paralyzed from the shoulders down.

There was another study that allowed paralyzed monkeys to walk.

One company currently working on creating a world without limitations, where anyone, differently-abled or fully-abled, can interact with or control anything using just their mind, is Neurable.

Their algorithm's goal is to understand user intent. So far they've created a virtual reality device and cap that records electrical signals from brain activity and interprets what you actually want to do from them.

They've created software that lets you control devices using just your mind, in both the real world and the digital world: essentially telekinesis.

The way it works is that when new information is presented to you, your frontal lobe, which is in charge of executive function, communicates with your parietal lobe, which helps with visuospatial processing.

They leverage those two areas of the brain to understand user intent and move external objects accordingly, which takes us to the world of brain-machine interfaces, where interfaces meet robotics and smart objects.

As BCIs progress exponentially and are used for purposes beyond helping the differently-abled, neurological data from more and more people will become available, and we will be confronted with critical ethical questions.

This future isn't as far away as it seems: Ray Kurzweil, a Director of Engineering at Google, thinks that by the 2030s we're all likely to have brain chips.

For example, the US military is in clinical trials for a mood-altering brain implant that would allow it to control how you feel. The intent is to help soldiers with depression or PTSD feel better. But if you think about it, that's still a third party controlling how you feel.

Researchers have also been able to detect whether you are laughing, smiling, running, or jumping in a dream. If we could program these dreams, which feel like real experiences, to create virtual realities of one's mind that you or other people could step into, then one's desires, secrets, and thoughts could be exposed and taken advantage of.

Even today we worry about the data being collected on us based on external, skin-outward actions: which pictures we like, who we meet, what we eat, and so on.

But with brain chips predicted to become the norm, third-party organizations could have access to what's going on inside us, technically knowing us better than we know ourselves.

That relationship is far more dangerous, and if manipulated by economically or politically incentivized organizations with malicious intent, the outcomes could be disastrous.

A solution proposed by Bryan Johnson, the founder of Kernel, is to recognize human data privacy as a basic right.

As BCIs continue to rapidly develop, future realities like brain-to-brain interfaces could transform our day-to-day interactions.

There was a study in which one person's EEG signals, encoding simple words, were sent over the internet and delivered to another person's brain through noninvasive stimulation, letting the second person perceive the message.

This is known as brain-to-brain communication, and it would allow humans to communicate with each other not by speaking, not by texting, but simply by thinking.

This could allow us to share our knowledge, experiences, and opinions with each other non-verbally, leading us to explore what it means to download knowledge and skill sets.

We spend over 20 years, about one-fifth of the human lifetime, in educational institutions acquiring knowledge. We learn what already exists, what is already available, what others already know.

There was a Harvard study showing that students are more likely to know where on the internet to find information about something than to know the information itself.

What if we could save years by downloading the knowledge and skills we need in the moment, and spend more of our time questioning and applying it, as opposed to just acquiring it?

There are researchers looking into how our consciousnesses are all somehow connected: a shared consciousness, where we could potentially experience someone else's life through their experiences in a dreamlike state.

This could potentially allow us to eliminate isolation and enable true empathy, while at the same time redefining what it means to be human.

Before we make telepathy, telekinesis, and the disappearance of smartphones a reality, I have a quick question for you:

What does it actually mean to be human?
