Tuesday, November 29, 2016

Get a Job In Silicon Valley by Playing a Coding Game


What if an online coding game could land you a six-figure job in California? Playing CodeFights could do exactly that. James Johnston discovered CodeFights through a Facebook ad and, intrigued, started to play. After two nights of coding problems, a pop-up message appeared asking Johnston if he was interested in getting a new job. He clicked yes, and the next day Tigran Sloyan, founder and CEO of CodeFights, called him to talk about potential jobs. Over the next month Johnston had a dozen interviews and landed a job in Silicon Valley. He went from designing software for orthodontists in Chattanooga, Tennessee, to working for Thumbtack, a one-billion-dollar startup in Silicon Valley. He even got a stake in the business.

Since launching in 2014, the San Francisco-based CodeFights has registered five hundred thousand users. The twenty best players are given the best job opportunities, but there have still been dozens of players who have landed jobs in the past month alone. Petroff quotes CEO Sloyan as saying that "about 20 percent of people who are connected with companies secure a new job" (CNN). However, there is a cost for companies who hire programmers through CodeFights: the company charges a fee of 15% of the annual salary they plan to pay the new employee. Even though this is a high price to pay, many companies are still willing to invest in top tech talent, meaning the players who are best at the coding games. CodeFights helps build individuals' coding ability and offers new talent to companies hiring software programmers.
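As a back-of-the-envelope sketch, the 15% placement fee scales directly with the new hire's pay. The salary figures below are hypothetical examples, not numbers from the article:

```python
def placement_fee(annual_salary, rate=0.15):
    """Fee CodeFights charges the hiring company: 15% of the new hire's annual salary."""
    return annual_salary * rate

# A six-figure Silicon Valley offer of $120,000 would cost the hiring company
# a one-time fee of $18,000 on top of the salary itself.
fee = placement_fee(120_000)
```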


Friday, November 11, 2016

Creating Wireless Virtual Reality

A new cordless virtual reality system consists of two directional antennas that relay high-frequency signals to the headset.

The biggest issue with current virtual reality headsets is that they must be connected to the computers that render their high-resolution visuals. The headset is tethered to the computer by an HDMI cable, which annoys users, who have to maneuver around it and try not to trip. Recently, researchers at MIT's CSAIL have developed MoVR, a system that lets users wear any virtual reality headset wirelessly. The system works by using millimeter waves (mmWaves), which are high-frequency radio signals, to carry data between the computer and the headset. Millimeter waves are also expected to power the blazing-fast smartphone connections of the future.

Wireless virtual reality headsets are more comfortable for users, but they can't access all the advanced data processing a tethered setup provides. To project the same high-resolution visuals as a cabled VR headset, a wireless system needs data rates of more than 6 Gbps, which no system today can sustain. MoVR works with mmWaves, which have been used for applications such as high-speed internet and cancer diagnosis. The downside of mmWaves, however, is that they require an unobstructed connection between transmitter and receiver, and this connection can be blocked simply by moving anything between the two. The CSAIL team found a way around this problem by designing MoVR to act as a programmable mirror that detects the incoming mmWave signal and redirects it toward the receiver. MoVR computes the angles needed to reflect signals from the transmitter toward the receiver on the headset, using two antennas, called phased arrays, that focus signals into narrow beams.
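To make the mirror idea concrete, here is a minimal sketch of the geometry involved: an ideal flat mirror redirects a beam when its surface normal bisects the incoming and outgoing directions. The function and angle convention are illustrative assumptions; the real MoVR steers beams electronically with phased arrays rather than by rotating a physical surface.

```python
def mirror_angle(angle_from_transmitter_deg, angle_to_receiver_deg):
    """Orientation (degrees) an ideal mirror's normal must take so a beam
    arriving from the transmitter's direction reflects toward the receiver.
    For specular reflection, the normal bisects the two directions."""
    return (angle_from_transmitter_deg + angle_to_receiver_deg) / 2.0

# If the transmitter sits at 30 degrees and the headset's receiver at 90 degrees,
# the mirror normal should point halfway between them:
print(mirror_angle(30, 90))  # 60.0
```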

References:
http://news.mit.edu/2016/enabling-wireless-virtual-reality-1114
http://www.techtimes.com/articles/185580/20161112/htc-vive-virtual-reality-headset-goes-wireless-220-upgrade-kit-now-open-for-preorders.htm

Friday, November 4, 2016

Using Computer Science to Detect Childhood Communication Disorders


Massachusetts General Hospital's Institute of Health Professions has been working with computer science researchers at MIT to create a computer system that automatically determines whether a child has a speech or language disorder. It's important to diagnose these disorders at a young age so children can learn to grow out of them by adolescence; unfortunately, sixty percent of affected children remain undiagnosed when they reach kindergarten. The system diagnoses speech and language disorders by analyzing audio recordings of children retelling a story: the children watch a series of images accompanying a narrated story, then tell the story back in their own words. To check the system's accuracy, the researchers used "a standard measure called area under the curve, which describes the tradeoff between exhaustively identifying members of a population who have a particular disorder, and limiting false positives" (Hardesty). Across three tests, the system proved accurate about eighty percent of the time; in medicine, a test that works more than seventy percent of the time is considered accurate.

MIT researchers John Guttag and Jen Gong believed that pauses in children's speech, such as when they struggle to complete a sentence or remember a word, could help diagnose communication disorders. So they built thirteen acoustic features of children's speech into their system, which recognizes the patterns of pauses and speech errors that correlate with the communication disorders it can diagnose. Among the features are the number of pauses, whether pauses are short or long, and the variability of pause length. Thomas Campbell, a professor of behavioral and brain sciences at the University of Texas at Dallas, says, "The researchers' automated approach to screening provides an exciting technological advancement that could prove to be a breakthrough in speech and language screening of thousands of young children across the United States" (Hardesty).
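The pause-based features can be sketched in a few lines. This is an illustrative reconstruction, not the researchers' actual feature extractor: it assumes speech has already been segmented into (start, end) times in seconds, and the 1-second threshold for a "long" pause is an assumption.

```python
def pause_features(segments):
    """Compute pause-based features from (start_sec, end_sec) speech segments,
    in the spirit of the acoustic features described in the article.
    The exact feature set here is illustrative."""
    # A pause is the gap between the end of one segment and the start of the next.
    pauses = [b[0] - a[1] for a, b in zip(segments, segments[1:])]
    n = len(pauses)
    mean = sum(pauses) / n
    variance = sum((p - mean) ** 2 for p in pauses) / n
    return {
        "num_pauses": n,
        "mean_pause_len": mean,
        "long_pause_ratio": sum(p > 1.0 for p in pauses) / n,  # pauses over 1 s
        "pause_len_variance": variance,
    }

# A child speaks from 0-1 s, 1.5-3 s, and 5-6 s: two pauses of 0.5 s and 2 s.
feats = pause_features([(0.0, 1.0), (1.5, 3.0), (5.0, 6.0)])
```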

Resources:
Hardesty, Larry. http://news.mit.edu/2016/automated-screening-childhood-communication-disorders-0922
https://techcrunch.com/2016/09/23/machine-learning-could-automate-screening-kids-for-speech-and-language-disorders/




Friday, October 28, 2016

Hacking For Good





When someone thinks of hacking, they usually think of being robbed of personal information: credit card numbers, Social Security numbers, and anything else a hacker wants. The same thing can happen to governments and companies that don't sufficiently secure their databases and software. However, two MIT students developed a way to make hacking work for companies instead of against them. Michael Borohovski and Ainsley Braun created the fast-growing startup Tinfoil Security. Tinfoil Security sells commercial scanning software that uses hacking techniques to detect vulnerabilities in websites, alerting developers and engineers so they can quickly fix the issues before a site goes live. Thousands of startups already use the software while developing their websites. Braun states that 75 percent of companies that have run the software found some form of vulnerability on their website. Tinfoil's website has a ticker showing how many vulnerabilities the software has detected so far; it currently reads 450,000. Braun says the company's number one goal is to secure the internet and end the threat from hackers.

Tinfoil's software finds vulnerabilities by crawling websites, much as Google does, but instead of looking for text and images, it looks for anywhere it can inject code to exploit a vulnerability. The software doesn't have access to source code or anything else an external hacker wouldn't have; instead it goes through every possible entry point and attempts to see whether there's a vulnerability. Currently the software has techniques to detect 50 different vulnerabilities, including the Open Web Application Security Project's top ten Web app risks, and for each kind of vulnerability it can run anywhere from ten to a hundred tests. Tinfoil has only five employees, and they constantly update the software as new risks and attacks are discovered. One of the most common vulnerabilities is insecure cookies: if someone logs onto a website over a public Wi-Fi hotspot, it's possible for a hacker to steal an insecure cookie and impersonate the user. On the user's end, the developer sees a description of each vulnerability, including its location and impact on the website, along with step-by-step instructions, keyed to specific programming languages, for fixing it. It's nice to see individuals using computer science to counter hackers who use it for unlawful purposes.
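The core loop of such a scanner can be sketched as follows. This is a hypothetical, much-simplified illustration, not Tinfoil's actual code: it checks a single entry point for reflected cross-site scripting by injecting marker payloads and seeing whether they come back unescaped in the page.

```python
import html

# Marker payloads a scanner might inject into an entry point (illustrative).
PAYLOADS = ["<script>probe()</script>", "'\"><img src=x>"]

def scan_entry_point(render_response):
    """render_response(value) -> the HTML a site produces for that input.
    Returns the payloads that come back unescaped, i.e. likely XSS holes."""
    return [p for p in PAYLOADS if p in render_response(p)]

# A vulnerable page echoes input verbatim into its HTML:
vulnerable = lambda q: f"<p>You searched for {q}</p>"
# A safer page escapes special characters first:
safe = lambda q: f"<p>You searched for {html.escape(q)}</p>"
```

Running the probe against `vulnerable` flags both payloads, while `safe` comes back clean, which mirrors the report-and-fix cycle the article describes.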


Example of list of vulnerabilities found on a website:



Resources:
https://www.tinfoilsecurity.com/about
http://news.mit.edu/2014/tinfoil-security-catches-web-vulnerabilities-0917
https://www.cloudflare.com/apps/tinfoil-security/


Friday, October 21, 2016

LED-filled "Robot Garden" Making Coding More Appealing





The "robot garden" is dozens of color-changing LED lights and more than a hundred origami robots that can swim, crawl, and blossom like flowers. It was developed by a team at MIT's Computer Science and Artificial Intelligence Lab. The garden is controlled from a Bluetooth-connected tablet and showcases the team's recent research on a variety of algorithms through robotic sheep, origami flowers that blossom and change color, and robotic ducks that fold into shape when baked in an oven. The researchers describe the "robot garden" as a visual symbol of their latest work in computing, as well as an artistically appealing way to attract young adults to programming.

The system is controlled through a simple "control by click" feature or a "control by code" feature. "Control by click" lets you command the garden by clicking on individual flowers, while "control by code" lets you write your own commands and programs in real time. Seeing their code act in a physical environment helps students understand that programming is a cool and unique ability to have. The garden has sixteen tiles connected via Arduino controllers and programmed with search algorithms that explore the space in different ways. One of these algorithms is graph coloring, which ensures no two adjacent tiles share the same color. The garden tests algorithms for over 100 robots, allowing a great deal of experimentation. For example, one MIT researcher developed a system that uses object-recognition algorithms to make robots water, harvest, and take metrics of a vegetable garden. The "robot garden" illustrates how young students and adults need to experience the real-world applications of programming in order to appreciate the unique and innovative aspects of coding.
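A greedy version of that graph-coloring constraint fits in a few lines. The tile layout and color names here are made up for illustration; the garden's actual algorithm may differ:

```python
def color_tiles(adjacency, colors):
    """Greedy graph coloring: give each tile the first color not already used
    by a colored neighbor, so no two adjacent tiles ever match."""
    assignment = {}
    for tile in sorted(adjacency):
        used = {assignment[n] for n in adjacency[tile] if n in assignment}
        assignment[tile] = next(c for c in colors if c not in used)
    return assignment

# A 2x2 grid of tiles, each adjacent to its horizontal and vertical neighbors:
grid = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
result = color_tiles(grid, ["red", "green", "blue"])
```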


Video illustrating how it works:





References:
http://cacm.acm.org/news/183473-can-an-led-filled-robot-garden-make-coding-more-accessible/fulltext
http://news.mit.edu/2015/can-led-robot-garden-make-coding-more-accessible-0218
https://blog.adafruit.com/2015/02/23/can-an-led-filled-robot-garden-make-coding-more-accessible-code-robotics-womeninstem/

Friday, October 14, 2016

Solving the Issue of Drug Errors




MIT graduate entrepreneurs Gauti Reynisson and Ívar Helgason were working for hospitals and health care companies implementing medication safety technologies when they recognized a major health issue: 1.5 million patients in the United States experience prescription medication errors every year due to drug administration mistakes. They returned to MIT to find a solution and created the MedEye. Developed and marketed by the startup Mint Solutions, MedEye has made its way into hospitals in the Netherlands, where it has caught the medical community's attention: Dutch hospitals found that ten percent of MedEye's scans caught medication errors. Mint Solutions' goal is to help nurses administer prescription medication efficiently and correctly. Currently, Mint Solutions is working with Dutch health care providers to bring the MedEye to fifteen more hospitals and to countries including the UK, Belgium, and Germany.



To use the MedEye, a patient must wear a wristband with a barcode. The nurse scans the barcode, which pulls up the patient's medical record, then puts the prescribed pills into the MedEye tray. The MedEye uses a small camera to scan the pills and analyze their size, shape, color, and markings. The computer science comes into play when the software identifies each pill by matching it, via algorithms, against a database of known medications. Impressively, MedEye's software then updates and cross-references the results in the patient's medical record. The results appear as color-coded boxes: green means a pill was correctly prescribed, red means it was wrong or unknown. What makes the MedEye unique, Helgason says, is that it requires no change in a hospital's workflow or logistics: "it's more usable and accessible in health care facilities" (Stop Drug Errors). It's great to see computer science becoming an important part of the innovation and growth of medicine and the administration of drugs.
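The matching step can be sketched as a nearest-neighbor lookup over visual features. Everything here, including the feature set, the database entries, and the distance weights, is a hypothetical illustration, not MedEye's actual design:

```python
# Hypothetical database of known medications and their visual features.
PILL_DB = {
    "pill_A": {"size_mm": 8.0,  "color": "orange", "shape": "round"},
    "pill_B": {"size_mm": 10.0, "color": "brown",  "shape": "oval"},
}

def match_pill(scan, db=PILL_DB):
    """Return the database entry closest to the scanned pill's features."""
    def distance(known):
        d = abs(known["size_mm"] - scan["size_mm"])   # size difference in mm
        d += 0 if known["color"] == scan["color"] else 5  # mismatch penalty
        d += 0 if known["shape"] == scan["shape"] else 5
        return d
    return min(db, key=lambda name: distance(db[name]))

# A scan of a slightly larger orange round pill still matches pill_A:
best = match_pill({"size_mm": 8.2, "color": "orange", "shape": "round"})
```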

References:
http://mintsolutions.eu/medeye-landing-en/#medeye-nurse-1
http://news.mit.edu/2014/startup-stops-drug-errors-0828
http://impressivemagazine.com/2013/11/02/medeye-system-reduces-medication-errors/

Friday, October 7, 2016

Detecting Emotions with Computer Science





In relationships it can sometimes be difficult to interpret what a friend or loved one is truly feeling. Most of our judgments are based on facial expressions and words, yet people often mask their emotions, whether out of fear of what others will think or deliberately, as with a poker face. Now, with the help of computer science, we can lift society's masks and find out what people are really feeling. Researchers at MIT's Computer Science and Artificial Intelligence Laboratory have created the "EQ-Radio," which uses wireless signals to detect what someone is really feeling. It can detect whether somebody is happy, sad, excited, or angry by measuring changes in breathing and heart rhythms. MIT professor and project lead Dina Katabi believes the system will be used in entertainment and health care across the world; it could also be used to gauge consumer behavior toward a product or business.

The EQ-Radio is unique compared with other emotion-detecting technology. Existing systems rely on audiovisual cues or on-body sensors; both are unreliable, because facial expressions can be masked and on-body sensors can be uncomfortable and inaccurate if they shift around. The EQ-Radio instead sends wireless signals toward a person and captures the reflections that bounce off their body back to the device. Programmed algorithms then break the reflections down into individual heartbeats, which the device analyzes to measure levels of arousal and positive or negative affect. These measurements are what give the EQ-Radio the power to distinguish emotions: low arousal and negative affect indicate sadness, while high arousal and positive affect indicate excitement. That said, the EQ-Radio has tested as only 87 percent accurate, so with a really good poker face you might still deceive the device.
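The final mapping from those two measurements to an emotion can be pictured as a two-dimensional quadrant lookup. The normalized [-1, 1] scale and zero thresholds below are illustrative assumptions; only the four emotion labels and the arousal/affect axes come from the article:

```python
def classify_emotion(arousal, affect):
    """Map arousal and positive/negative affect (both normalized to [-1, 1])
    onto the four emotions EQ-Radio reports, one per quadrant."""
    if arousal >= 0:
        return "excited" if affect >= 0 else "angry"
    return "happy" if affect >= 0 else "sad"

print(classify_emotion(-0.6, -0.4))  # sad: low arousal, negative affect
print(classify_emotion(0.7, 0.8))    # excited: high arousal, positive affect
```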







References:
http://news.mit.edu/2016/detecting-emotions-with-wireless-signals-0920
http://eqradio.csail.mit.edu/
https://www.engadget.com/2016/09/20/eq-radio-wireless-signals-emotion-detector/