Recent News & Events

Women who Inspire Speaker Series: Global Entrepreneurs Driving Change

  • Event Date:
    Wednesday, November 19, 2014
  • Time:
    5:00pm
  • Location:
    Egan Research Center

Programmed for Success

James Morrison Klein, a fifth-year information science major, recently returned from a six-month international co-op in France, where he gained valuable experience and confidence as a software developer at a startup called Copass.

Based in Paris, Copass is a global membership network that grants users access to more than 256 independent co-working spaces in 42 different countries with a single account. “Copassers” can use a space for as long as they need, whether that is a day or a month, and get the perks of a social experience.

“By American standards, Copass is a startup,” Klein explained, noting that the company has partnered with Airbnb to help travelers find accommodations. “It was a very tight-knit team with only five employees, including myself, and I really enjoyed the freedom that came with working with a startup as self-sufficient yet successful as Copass.”

As a software developer for the startup, Klein built web applications from scratch and maintained the company’s existing software. A trusted member of the small team, he frequently took on responsibilities beyond programming, from discussing long-range business decisions to offering input on how to keep the office running effectively day to day. “There were a huge slew of things beyond the programming,” he said.

Klein also became well versed in agile software development, which promotes adaptive planning, continuous improvement, and quick response to changing requirements. “Agile development is huge in the United States now because startups need quick prototyping,” Klein said. “There were times where I’d come in and need to learn a new framework and programming language in one day.”

Thankfully, Klein found a supportive mentor at Copass, who helped guide him through the new assignments. This encouragement not only helped him become a more effective independent worker, but it also boosted his confidence by instilling in him the belief that there was nothing he couldn’t do. “It was extremely empowering, from a technical perspective, and I’m extremely confident in my abilities since returning from the co-op.”

His experiential learning opportunity at Copass built on his co-op with Hercules Technology Growth Capital, a venture capital firm in Palo Alto, California, where he programmed internal software for client management.

The new skills he learned at Copass will come in handy in the classroom and in his career, said Klein, noting that both his co-op employers have discussed the possibility of hiring him as a full-time employee after graduation.

A Simulation Game to Help People Prep for Court

Preparing for court and appearing before a judge can be a daunting experience, particularly for people who are representing themselves because they can’t afford a lawyer or simply don’t know the ropes of the legal process.

That’s why an interdisciplinary team of Northeastern faculty, staff, and students from the School of Law, the College of Arts, Media and Design, and the College of Computer and Information Science is developing an online simulation that would provide self-represented litigants with advocacy experience before they appear in court for real.

The simulation game is particularly targeted at the growing number of people who cannot afford legal representation and thus represent themselves in legal proceedings ranging from evictions and mortgage foreclosures to child custody disputes and debt collection cases. Nationally, more than 80 percent of people with legal problems must resolve them without the assistance of a lawyer. When a dispute lands in court, people without any legal training find themselves addressing a judge, questioning witnesses, and offering documents into evidence.

The simulation game would let individuals try out these kinds of experiences in a virtual world before they appear in an actual courtroom. It would ultimately be made available online for free.

“This is the beginning of something that could be transformational in the legal system,” said Dan Jackson, executive director of the NuLawLab. Jackson and NuLawLab faculty director Martha Davis are leading the project for Northeastern in tandem with professors Casper Harteveld and Gillian Smith, who work in Northeastern’s Playable Innovative Technologies Lab. Law students enrolled this winter in a newly created lab seminar on applied design and legal empowerment will also be involved in the project, which officially begins Jan. 1.

The project was one of 17 worldwide nominated by the Hague Institute for the Internationalization of Law for its Innovating Justice Award – Innovative Idea 2014. Earlier this month, the Northeastern “virtual courtroom” project received the most votes (988) of the nominees and will now have a shot at being named a finalist for the award in November at the institute’s Innovating Justice Forum in the Netherlands.

The Northeastern team will work with project lead Statewide Legal Services of Connecticut and New Haven Legal Assistance. The project has already received funding through the Legal Services Corporation’s Technology Initiative Grants program.

The project builds upon the NuLawLab’s work exploring new ways of delivering legal assistance and education to lawyers in order to provide more people with access to their legal rights. It also continues Harteveld and Smith’s “citizen science” projects, in which users themselves can contribute to scientific research through game-based platforms. They are developing a game called “Mad Science” that aims to foster a culture of curiosity and learning by allowing users (the “mad scientists”) to create their own virtual experiments and recruit friends to participate.

In addition to providing self-represented parties with foundational advocacy experience, project leaders said the “virtual courtroom” would help build a community of support around these people’s needs. They envision users eventually being able to communicate with each other, share their courtroom experiences, and help first-timers navigate the process.

The team said community participation is both a unique aspect and a driving force of the project. Northeastern will lead collaborative design workshops attended by community stakeholders, including judges, court clerks, attorneys, people who have already represented themselves in court, and those in the midst of doing so. Researchers will gather information and feedback during the game’s testing phase and deployment, learning how players respond in these virtual court situations; that data can be used to improve the simulation and to shed light on human decision-making and community building.

“We will work closely with the community to design this game and maximize its impact,” Harteveld said.

Fingertip Sensor Gives Robot Unprecedented Dexterity

Researchers at MIT and Northeastern University have equipped a robot with a novel tactile sensor that lets it grasp a USB cable draped freely over a hook and insert it into a USB port.

The sensor is an adaptation of a technology called GelSight, which was developed by the lab of Edward Adelson, the John and Dorothy Wilson Professor of Vision Science at MIT, and first described in 2009. The new sensor isn’t as sensitive as the original GelSight sensor, which could resolve details on the micrometer scale. But it’s smaller — small enough to fit on a robot’s gripper — and its processing algorithm is faster, so it can give the robot feedback in real time.

Industrial robots are capable of remarkable precision when the objects they’re manipulating are perfectly positioned in advance. But according to Robert Platt, an assistant professor of computer science at Northeastern and the research team’s robotics expert, for a robot taking its bearings as it goes, this type of fine-grained manipulation is unprecedented.

“People have been trying to do this for a long time,” Platt says, “and they haven’t succeeded because the sensors they’re using aren’t accurate enough and don’t have enough information to localize the pose of the object that they’re holding.”

The researchers presented their results at the International Conference on Intelligent Robots and Systems this week. The MIT team — which consists of Adelson; first author Rui Li, a PhD student; Wenzhen Yuan, a master’s student; and Mandayam Srinivasan, a senior research scientist in the Department of Mechanical Engineering — designed and built the sensor. Platt’s team at Northeastern, which included Andreas ten Pas and Nathan Roscup, developed the robotic controller and conducted the experiments.

Synesthesia

Whereas most tactile sensors use mechanical measurements to gauge mechanical forces, GelSight uses optics and computer-vision algorithms.

“I got interested in touch because I had children,” Adelson says. “I expected to be fascinated by watching how they used their visual systems, but I was actually more fascinated by how they used their fingers. But since I’m a vision guy, the most sensible thing, if you wanted to look at the signals coming into the finger, was to figure out a way to transform the mechanical, tactile signal into a visual signal — because if it’s an image, I know what to do with it.”

A GelSight sensor — both the original and the new, robot-mounted version — consists of a slab of transparent, synthetic rubber coated on one side with a metallic paint. The rubber conforms to any object it’s pressed against, and the metallic paint evens out the light-reflective properties of diverse materials, making it much easier to make precise optical measurements.

In the new device, the gel is mounted in a cubic plastic housing, with just the paint-covered face exposed. The four walls of the cube adjacent to the sensor face are translucent, and each conducts a different color of light — red, green, blue, or white — emitted by light-emitting diodes at the opposite end of the cube. When the gel is deformed, light bounces off the metallic paint and is captured by a camera mounted on the same cube face as the diodes.

From the different intensities of the different-colored light, the algorithms developed by Adelson’s team can infer the three-dimensional structure of ridges or depressions of the surface against which the sensor is pressed.
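To make that reconstruction step concrete, the sketch below shows the general class of technique this description corresponds to: color photometric stereo, in which each color channel acts as a measurement of the surface under one known light direction. It is a minimal illustration, not the team's actual algorithm; the light directions are invented placeholders (the real sensor is calibrated empirically), and only three of the four channels are used for brevity.

    # Minimal color photometric stereo sketch (assumes Lambertian reflectance).
    # The light directions below are illustrative placeholders, not the
    # calibrated values used in the actual GelSight sensor.
    import numpy as np

    L = np.array([
        [ 0.8,  0.0, 0.6],   # assumed unit direction of the red LED
        [-0.4,  0.7, 0.6],   # assumed unit direction of the green LED
        [-0.4, -0.7, 0.6],   # assumed unit direction of the blue LED
    ])

    def normals_from_rgb(image):
        """Recover per-pixel surface normals from an H x W x 3 image in which
        each channel records brightness under one light: I_c = albedo * (L_c . n)."""
        h, w, _ = image.shape
        intensities = image.reshape(-1, 3).T        # 3 x N stack of RGB values
        g = np.linalg.solve(L, intensities).T       # N x 3: normal scaled by albedo
        norms = np.linalg.norm(g, axis=1, keepdims=True)
        return (g / np.clip(norms, 1e-8, None)).reshape(h, w, 3)

    def heights_from_normals(normals):
        """Turn normals into a rough height map by integrating the surface
        gradients; a naive cumulative sum stands in for a proper Poisson solver."""
        p = -normals[..., 0] / normals[..., 2]      # dz/dx
        q = -normals[..., 1] / normals[..., 2]      # dz/dy
        return np.cumsum(p, axis=1) + np.cumsum(q, axis=0)

With three known, non-coplanar light directions the 3-by-3 system is invertible, which is why a single color image of the deformed gel is enough to recover the surface shape.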

There are several ways of measuring human tactile acuity; one is to determine how far apart two small bumps need to be before a subject can distinguish them just by touching, and the answer is usually about a millimeter. By that measure, even the lower-resolution, robot-mounted version of the GelSight sensor, which can resolve features on the order of tens of micrometers, is about 100 times more sensitive than a human finger.

Plug ‘n play

In Platt’s experiments, a Baxter robot from MIT spinout Rethink Robotics was equipped with a two-pincer gripper, one of whose pincers had a GelSight sensor on its tip. Using conventional computer-vision algorithms, the robot identified the dangling USB plug and attempted to grasp it. It then determined the position of the USB plug relative to its gripper from an embossed USB symbol. Although there was a 3-millimeter variation in where the robot grasped the plug, it was able to measure its position accurately enough to insert it into a USB port that tolerated only about a millimeter’s error.
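As a rough illustration of that correction step, here is a hypothetical sketch: the tactile height map is searched for the embossed USB logo by template matching, and the offset between where the logo landed and where a perfectly centered grasp would put it is what a controller could subtract from the commanded insertion pose. The function names, the template-matching approach, and the MM_PER_PIXEL scale are all assumptions, not the team's actual code.

    # Hypothetical sketch of grasp-offset estimation from a tactile image.
    # MM_PER_PIXEL is an assumed calibration constant, not the sensor's real value.
    import numpy as np

    MM_PER_PIXEL = 0.05

    def locate_symbol(height_map, template):
        """Find the embossed USB logo in the tactile height map by normalized
        cross-correlation (a simple stand-in for the actual localization step).
        Returns the best-matching top-left corner in pixel coordinates."""
        th, tw = template.shape
        t = (template - template.mean()) / (template.std() + 1e-8)
        best_score, best_xy = -np.inf, (0, 0)
        for y in range(height_map.shape[0] - th + 1):
            for x in range(height_map.shape[1] - tw + 1):
                patch = height_map[y:y + th, x:x + tw]
                p = (patch - patch.mean()) / (patch.std() + 1e-8)
                score = float((p * t).sum())
                if score > best_score:
                    best_score, best_xy = score, (x, y)
        return best_xy

    def grasp_offset_mm(height_map, template, expected_xy):
        """Offset between where the logo actually sits in the grasp and where
        it would sit in a perfectly centered one, in millimeters. Grasp error
        could reach ~3 mm while the port tolerated only ~1 mm, so this offset
        must be subtracted from the commanded insertion pose."""
        fx, fy = locate_symbol(height_map, template)
        return ((fx - expected_xy[0]) * MM_PER_PIXEL,
                (fy - expected_xy[1]) * MM_PER_PIXEL)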

“Having a fast optical sensor to do this kind of touch sensing is a novel idea,” says Daniel Lee, a professor of electrical and systems engineering at the University of Pennsylvania and director of the GRASP robotics lab, “and I think the way that they’re doing it with such low-cost components — using just basically colored LEDs and a standard camera — is quite interesting.”

How GelSight fares against other approaches to tactile sensing will depend on “the application domain and what the price points are,” Lee says. “What Rui’s device has going for it is that it has very good spatial resolution. It’s able to see heights on the level of tens of microns. Compared to other devices in the domain that use things like barometers, the spatial resolution is very good.”

“As roboticists, we are always looking for new sensors,” Lee adds. “This is a promising prototype. It could be developed into a practical device.”

Article from MIT News