Look, Learn, Ask, Try: Geran, Kyle & Terrence

This one was a little weird for us because our application spans several paradigms, so it was hard to scope exactly who we were going to study and ask questions about.  However, Brint Carlson, a personal friend of mine, is currently a tank engineer in ROTC here at ASU, and since he was an easy point of contact I went ahead and brought the idea up to him.

Look: I observed him doing some training out in the desert; this particular session was medical.  I was granted permission since it was a school project, but I was not allowed to take any photos or video because it’s the military, so they weren’t super stoked about that.  The exercise was to discover fellow soldiers in need, assess their issue and respond accordingly.

Ask: I asked them what typically happens ‘in the field’ and how one goes about helping.  They responded that most often there is a team of medics, and they are usually the only ones qualified to handle such situations.  Not many people are equipped with the tools or expertise to assess a soldier in need.

Learn: I learned that one of the biggest problems is communication: it is hard to find the people in need, or when medics do locate them, they realize they didn’t bring the necessary equipment for the task at hand.  They then need to go back to camp and grab the gear, which can sometimes take too long, resulting in a fatality.  We couldn’t exactly ‘try’ this activity, of course, but we can speculate that some kind of equipment that alerts local medics about the condition of a soldier could save a lot of time locating them and bringing the correct equipment for the job.

Try: As far as ‘trying’ goes, I have started to implement a machine learning aspect to the EEG helmet that can try to categorize different conditions of the brain.  eMotiv actually has a large database that you can subscribe to that shows common data structures for different things like seizures, massive sudden pain trauma, elevated endorphins, etc.  So at some point I would like to train some data sets and see if I can detect matches to different scenarios and situations.  Of course I don’t want to fake a seizure or something, so the scenarios will have to be milder, like being happy, sad or frightened.

 

Status Update: Hazardous and Safety Conditions Wearable. Geran, Terrence & Kyle.

So far we have implemented the more hands-on portion of our project by testing out both the Google Glass and the Epoc+ helmet from eMotiv.  It is taking a bit to get the two to jibe as far as the data goes, but we are able to harness visual data and EEG data simultaneously, which was basically the end goal for the proof-of-concept aspect of the project.

 

Below is a simultaneous reading of my EEG signals while wearing the Google Glass to display data and record video.  This displays all the necessary EEG waves while recording the environment.

 

In the coming days, if we have the chance, I would like to pipe the EEG data into a Max patch through the serial port, if that is possible, and do a small machine learning aspect with Wekinator.  However, this is not a priority, as our speculative design is much more important for the bio-design challenge.

The real next few steps are to design one or more speculative designs that are futuristic in nature and can span different markets.  The end product wouldn’t be an EEG helmet and a pair of glasses, but more of an insertable or a wearable device tailored to the application at hand.  For example, we could make small insertables for elderly folks in a nursing home, special military applications for a group on a secret mission, or even devices given out to the population during a natural disaster.  Each speculative design would have its own capabilities and implementation.  The nursing-home version may be equipped with a heart rate monitor and GPS in case the patient has a history of wandering off, which happens quite often when elderly people suffer from dementia.  The military application could track pain through EEG signals and alert neighboring forces if someone is in trouble.  The general-population wearable could be equipped with whatever is necessary for the disaster at hand, be it a wildfire, hurricane or tsunami.

From here on out we will do the following:

Geran- Continue developing the Google Glass / Epoc+ helmet to get some consistent results to show a more usable application in the future.  Also help with speculative designs.

Kyle- Develop ideas for different speculative designs with Terrence.  Also organize and prepare our presentation and exactly how we plan to reach the correct audience for the product.  Narrow down our research and end goals.

Terrence- Develop ideas for different speculative designs and how many different applications we may want to involve ourselves in.  Make some 3D models of potential designs and explain how/why they would work.

 

 

Project Update: Hazardous and Safety Conditions Wearable. Geran, Terrence & Kyle.

Final Deliverable:

Based on the feedback we have received, the resources we have and the timeline we are working with, the final deliverable for the class will mainly consist of research, a digitally produced prototype and video documentation of a mock-up system using the tools we have.  With the eMotiv and Google Glass (if we can incorporate them) and maybe a few other sensors, we can demonstrate a proof of concept while still projecting our idea into the future as a more reasonable and less intrusive wearable.  We would digitally design a device that serves our intended purpose, but demonstrate how and what we would record and use as data with the eMotiv and other devices.  If time allows, we would also begin the AR/VR portion of the proposal as an extension of the mock-up system, but we are not sure if we’ll have the time.  A mock-up of that mock-up might be in order just to show what that part of the project would look like.

 

Class Feedback:

Likes:

-I like that this helps people.

-I like the concept of helping people during natural disasters.

-I like the idea of wearables for emergency response.

-I like the prior research you have done for this project.  The networking part is strong.

-I like that they have a physical deliverable for a potential VR simulation.  Also having a video to demonstrate is a good idea as well.

-I like that this idea could be implemented in already existing technologies (wearables).

-I like the idea of applying wearable technologies to improve safety and humanitarian efforts.

-I like the concept and implementation.

-I like the overall idea of a better wearable.

-I like the idea of networking and crowdsourcing.

-I like the idea and the technology involved.

-I like the focus on public safety.

-Cool idea, could keep people safe (yup).

-I like the focus on creating a better disaster response system.

-I like the idea of faster first response.

 

General Liked Themes:

-Natural Disasters & First Response / Safety

-Video & Physical Deliverable / Technology Involved

-Networking

 

Dislikes:

-I don’t like how this seems glasses-unfriendly; what about people who already wear glasses?

-I wish they elaborated more on the physical aspect / design concepts.

-I wish you addressed potential privacy data concerns with broadcasting health data to the cloud.

-I wish this engaged with broader issues around biotech; what is the vision for biotechnology here?

-I wish they had possible solutions for making it waterproof.

-Someone said something about police being unethical.

-I wish the exact application for the technology was more well-defined.

-I wish you would explain which government systems would use this device.

-I wish the wearable was designed for a more specific group of people.

-I wish I had a better idea of the specific application for your device.

-I wish you mentioned example data or information you would like to use in the system.

-I wish we could see a working prototype.

-I wish the group thought more about how to interconnect users.

-I wish there was a better way to transfer data other than the ‘cloud’ (don’t think that’s possible?).

 

General Disliked Themes:

 

-Lots of Concern About Data

-More Niche of a Market

-Technology Involved and Relation to Biotech

 

Brainstorming – Geran Pele

I am very much late to turn this in but I figured better late than never!


My ideas include:

Bio-Art / Antibiotics:

Living Paintings

Living Sculptures

Wearable accessories that change based on the environment

Color-changing soap based on how ‘dirty’ you are

Antibacterial trash bags?

Some kind of art based on your diet

Bioluminescent workspaces

Sustainable Fabrication:

Organic fuel! (cop out)

Organic rooftops

Organic tires

Plant based office supplies and furniture

Integrating plants into infrastructure

Modular biological design (like PhoneBloks)

Organic cables

Biometrics:

Sleep cycle monitor that wakes you up after a cycle is complete

EEG sensing for marketing

Digital muscular sensor for exerted energy

Location based emotional maps

Biometrics to influence political decisions

Piezo generators on the body

Monitoring for dangerous work & environments

Mycelium Project (for the most part): Geran, Shomit, Will & Damon

Our project inspiration was a naturally formed terrarium with built-in sensors.  The idea was to use mycelium as a medium to not only create the biome, but also house the ideal sensors for whatever application you would use it for (snakes, lizards, exotic pets that require very specific environments, etc.).

Unfortunately, as many others experienced, our mold was, well, moldy, and thus we could not use our original intended design.  That said, this is a sort of proof of concept, or at least a small piece of the project idea, demonstrating a particular sensor one might use in a terrarium (a UV sensor) with an LCD display showing the current UV index and visible light.  Ideally we would combine this with several other sensors to monitor a habitat.

Arduino Code:

#include <Wire.h>
#include <SoftwareSerial.h>
#include "Adafruit_SI1145.h"

//LCD object:
SoftwareSerial lcd = SoftwareSerial(2, 3);

//UV Sensor object:
Adafruit_SI1145 uv = Adafruit_SI1145();

void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
  lcd.begin(9600);

  Serial.println("Adafruit SI1145 test");

  if (!uv.begin()) {
    Serial.println("Didn't find Si1145");
    while (1);
  }

  Serial.println("OK!");

  //Set the LCD backlight to red (command 0xFE 0xD0, then R, G, B):
  lcd.write(0xFE);
  lcd.write(0xD0);
  lcd.write((uint8_t)255);
  lcd.write((uint8_t)0);
  lcd.write((uint8_t)0);
  delay(10); // give it some time to adjust the backlight!
}

void loop() {
  //Read UV index:
  float UVindex = uv.readUV();
  //The sensor reports the index multiplied by 100, so scale it back:
  UVindex /= 100.0;

  //Clear the screen (command 0xFE 0x58):
  lcd.write(0xFE);
  lcd.write(0x58);

  //Display UV Index:
  lcd.print("UV Index: ");
  lcd.print(UVindex);
  lcd.println("");

  //Display Visible Light:
  lcd.print("VL: ");
  lcd.print(uv.readVisible());

  delay(1000);
}

Arduino Soft Switch – Geran Pele

For this assignment I first designed my circuit in Fritzing (totally didn’t realize a completed circuit was provided, but good practice I suppose!).  I then re-created this circuit using a breadboard just to ensure that everything checked out.

I then ported this circuit to a cardboard and copper tape version to experiment with different materials.


Then I finally used my body as the switch!  In this admittedly rough documentation video it is hard to tell that the lights are switching because of the very bright room, but I show that the switch is working via the serial monitor, which prints “high” or “low”.

 

 

const int offLed = 10;
const int onLed = 8;

const int switchPin = 3;

int state = 0;

void setup() {
  Serial.begin(9600);

  //Set pins for the LEDs
  pinMode(offLed, OUTPUT);
  pinMode(onLed, OUTPUT);
  //Set input pin for the switch
  pinMode(switchPin, INPUT);
}

void loop() {
  state = digitalRead(switchPin);

  if (state == HIGH) {
    //When the switch is closed, turn on the 'on' LED and turn off the 'off' LED
    digitalWrite(offLed, LOW);
    digitalWrite(onLed, HIGH);
    Serial.println("high");
  } else if (state == LOW) {
    //When the switch is open, do the opposite
    digitalWrite(offLed, HIGH);
    digitalWrite(onLed, LOW);
    Serial.println("low");
  }
}