
Quadriplegic woman gets chocolate fix using thought-controlled robotic arm


Quadriplegic Jan Scheuermann prepares to take a bite out of a chocolate bar she is guiding into her mouth with a thought-controlled robot arm while research assistant Brian Wodlinger, Ph.D., watches on (Photo: UPMC)

Earlier this year, a 58-year-old woman who had lost the use of her limbs successfully drank a cup of coffee by herself using a robotic arm controlled by her thoughts via a brain-computer interface (BCI). Now, in a separate study, another woman with longstanding quadriplegia has been able to feed herself a chocolate bar using a mind-controlled, human-like robot arm, offering what researchers claim is a level of agility and control approaching that of a human limb.
The University of Pittsburgh School of Medicine and the University of Pittsburgh Medical Center (UPMC) developed the system, which was tested by Jan Scheuermann, 52, from Pittsburgh. A mother of two, she was diagnosed 14 years ago with spinocerebellar degeneration, a degenerative brain disorder that left her paralyzed from the neck down.
UPMC neurosurgeon Elizabeth Tyler-Kabara, who is also an assistant professor in the Department of Neurological Surgery at the Pitt School of Medicine, placed two electrode grids with 96 tiny contact points into regions of Scheuermann’s motor cortex that controlled right arm and hand movements.
“Prior to surgery, we conducted functional imaging tests of the brain to determine exactly where to put the two grids,” Tyler-Kabara said. “Then we used imaging technology in the operating room to guide placement of the grids, which have points that penetrate the brain’s surface by about one-sixteenth of an inch.”
These electrodes, picking up signals from individual neurons, were connected to a robotic hand powered by a computer running algorithms that detected real or imagined movements, like lifting an arm or rotating a wrist. The signals were then translated into instructions for the robotic arm, mimicking the way an unimpaired brain sends signals to move limbs.
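The decoding step described above can be sketched as a population-vector-style mapping from firing rates to a velocity command: each neuron "votes" for its preferred movement direction in proportion to how far its firing rate is above baseline. The code below is a simplified, hypothetical illustration with made-up neuron data, not the Pitt team's actual algorithm:

```python
import numpy as np

def decode_velocity(firing_rates, preferred_directions, baseline, gain):
    """Population-vector-style decoder: each neuron votes for its
    preferred direction, weighted by firing-rate modulation above
    baseline. A toy sketch, not the study's model."""
    weights = firing_rates - baseline               # modulation above baseline
    votes = weights[:, None] * preferred_directions # per-neuron direction votes
    return gain * votes.sum(axis=0)                 # combined velocity command

# Two toy neurons preferring the +x and +y directions, respectively
prefs = np.array([[1.0, 0.0], [0.0, 1.0]])
baseline = np.array([10.0, 10.0])
rates = np.array([20.0, 10.0])                      # only the +x neuron is active
v = decode_velocity(rates, prefs, baseline, gain=0.1)
print(v)  # → [1. 0.]
```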
Before the end of three months, Scheuermann was able to maneuver the hand, which she called Hector, accurately and precisely. With the ability to flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, the system offers what the researchers refer to as control in seven dimensions (7D). Using her mind, Scheuermann was able to instruct Hector to pick up blocks, tubes and a ball and put them in a tray.
Jan Scheuermann stacks cones with a mind-controlled robot arm (Photo: UPMC)
After some practice, she was able to perform the movements with 91.6 percent accuracy, and she could do so 30 seconds faster than at the start of the trial. Researchers found this to be clinically significant, opening the way for future developments in similar prostheses and further innovations that could change the lives of the disabled.
Using this technique, Scheuermann could feed herself chocolate – a goal she had voiced at the start of the trial. The researchers applauded her as she performed this feat less than a year later. “One small nibble for a woman, one giant bite for BCI,” Scheuermann quipped.
Professor Andrew Schwartz from the Department of Neurobiology at Pitt School of Medicine said the breakthrough was unique.
“This is a spectacular leap toward greater function and independence for people who are unable to move their own arms,” Schwartz said. “In developing mind-controlled prosthetics, one of the biggest challenges has always been how to translate brain signals that indicate limb movement into computer signals that can reliably and accurately control a robotic prosthesis.
"Most mind-controlled prosthetics have achieved this by an algorithm which involves working through a complex 'library' of computer-brain connections," Schwartz added. "However, we've taken a completely different approach here, by using a model-based computer algorithm which closely mimics the way that an unimpaired brain controls limb movement. The result is a prosthetic hand which can be moved far more accurately and naturalistically than previous efforts."
The next big step for the BCI technology could be to stimulate the brain to generate sensation by using a two-way electrode system. That would allow the user to "feel" objects and loosen their grip to pick up delicate ones or tighten it for a firmer grasp.
And after that, according to lead investigator Assistant Professor Jennifer Collinger, anything is possible. “It might even be possible to combine brain control with a device that directly stimulates muscles to restore movement of the individual’s own limb,” she said.
The team's study is published in The Lancet.

KUBI – the inexpensive telepresence "robot" for tablets


Revolve Robotics has developed an interactive tablet stand allowing you to pan and tilt during video calls

The burgeoning telepresence market continues to gather steam, with Revolve Robotics the latest venture to introduce a low-cost telepresence system through crowdfunding. But unlike the others, KUBI (Japanese for "neck") is mainly stationary. It's essentially a tablet stand that can hold a tablet in portrait or landscape mode and allows the caller to remotely pan and tilt the tablet to change their point of view.
The device can hold tablets with screen sizes ranging from 7.9 to 10.6 inches, which callers can pan left and right through 300 degrees and tilt up and down through 90 degrees using an app available for iOS and Android.
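As a rough sketch of how a remote-control app might constrain incoming commands, the hypothetical class below clamps requested pan and tilt angles to KUBI-like mechanical limits. The class, API, and the way the ranges are split around center are assumptions for illustration, not Revolve Robotics' actual interface:

```python
def clamp(value, lo, hi):
    """Constrain a value to the inclusive range [lo, hi]."""
    return max(lo, min(hi, value))

class PanTiltStand:
    """Minimal model of a remote pan/tilt stand with KUBI-like limits:
    300 degrees of pan and 90 degrees of tilt, assumed symmetric
    around center. Hypothetical API, for illustration only."""
    PAN_RANGE = (-150.0, 150.0)   # degrees; ±150 = 300° total
    TILT_RANGE = (-45.0, 45.0)    # degrees; ±45 = 90° total

    def __init__(self):
        self.pan = 0.0
        self.tilt = 0.0

    def move(self, pan, tilt):
        # Out-of-range requests are clamped to the mechanical limits.
        self.pan = clamp(pan, *self.PAN_RANGE)
        self.tilt = clamp(tilt, *self.TILT_RANGE)
        return self.pan, self.tilt

stand = PanTiltStand()
print(stand.move(200.0, -10.0))  # pan request exceeds limit → (150.0, -10.0)
```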
“We wanted to distill where the value is in a telepresence product,” Revolve Robotics founder Marcus Rosenthal told Wired. “We realized it’s the ability to look around the room and be present, not necessarily move from room to room.”
That may seem like an odd statement, given that mobility is the main selling point of telepresence robots versus video calling. However, KUBI doesn't really qualify as a robot. It's a simple, cost-effective addition to standard teleconferencing systems, but chances are good that, unless you're having a round-table meeting, you probably won't find that it adds any significant value to a standard video call.
KUBI's adjustable connector can hold tablets of varying sizes in portrait and landscape modes
Smaller mobile telepresence stands for smartphones – such as the Helios and Botiful – were designed to provide virtually the same functionality in a smaller package.
While the Helios failed to reach its funding goal, the Botiful, which works with an iPhone or Android phone, successfully scraped past its goal and will begin shipping next year. The difference is, the Botiful is able to move around a room, allowing you to chase a pet or toddler.
You can view Revolve Robotics' KUBI in the following preview video and track its progress at its Indiegogo campaign page. If it reaches its funding goal, KUBI is expected to retail for US$250.
Source: Revolve Robotics via Wired, Gizmag

MIT developing a robotic "Swiss Army knife" that changes shape to suit the job


The MIT milli-motein
An MIT team is developing a robot that has the potential to become possibly the most versatile machine ever. Referred to by the team as the "robotic equivalent of a Swiss Army knife,” the milli-motein robot is made up of a chain of tiny modules each containing a new type of motor that can be used to form the chain into various shapes. This shape-changing capability could lead to the creation of robots that dynamically change their form to suit the task at hand.
Diagram of the magnet assembly of the MIT milli-motein
The MIT team of Neil Gershenfeld, head of MIT’s Center for Bits and Atoms, visiting scientist Ara Knaian and postdoctoral associate Kenneth Cheung call this idea “programmable matter.” Ideally, what the team would like to achieve is a device that can change to become whatever is required at the time. If you need a hammer, it’s a hammer. If you need a spanner, it’s a spanner.
This device would be made up of tiny chains of mechanical modules strung together. That may sound a lot like other wormlike robots, but its purpose is more ambitious than just wiggling through cracks. With a paper published last year by another MIT team mathematically proving it is theoretically possible to reproduce any 3D shape by folding a sufficiently long string, such a chain modeled on protein molecules could be folded into any 3D shape.
The MIT milli-motein is based on a protein molecule seen here in simulation
As a first step toward programmable matter, the MIT team produced the milli-motein robot. Its name stands for MILLImeter MOtor proTEIN, and it is currently the world’s smallest chain robot. Created under a DARPA grant at MIT’s Center for Bits and Atoms, each of its modules is smaller than a cubic centimeter and is built around an entirely new kind of motor.
The milli-motein’s modules are too small to use conventional gears and motors, so in order to keep the design simple and the costs down, the team came up with a motor that is strong, yet maintains its position when turned off.
They call it an electropermanent motor and its principle is similar to that of scrapyard electromagnets used to lift cars. Flying in the face of intuition, some types of these electromagnets are actually turned off while they’re lifting scrap iron. That’s because they’re permanent magnets paired with a weaker electromagnet. When the electromagnet is switched on, it cancels the stronger permanent magnet’s field and the scrap drops.
In the milli-motein, coils switch off the permanent magnets that make up the rotor’s arms in sequence and cause it to turn. When the power is off, the motor locks. This means that the module is very energy-efficient, cheap, simple and needs no gears. It also makes the motor relatively powerful for its size. The motors can currently only lift one other segment in the robot’s chain, but the team hopes the use of lighter materials will enable it to lift two or three.
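That sequencing can be modeled in a few lines: each coil pulse cancels the next arm's permanent-magnet field, pulling the rotor around by one arm pitch, and with no power applied the position simply holds. This is a toy model of the behavior described above, not the actual milli-motein firmware:

```python
class ElectropermanentRotor:
    """Toy model of the electropermanent motor: each coil pulse cancels
    the next arm's field and advances the rotor by one arm pitch. With
    no pulses, the angle holds, mirroring the motor's locking behavior.
    An illustration, not MIT's design."""

    def __init__(self, num_arms):
        self.num_arms = num_arms
        self.pitch = 360.0 / num_arms   # degrees advanced per pulse
        self.angle = 0.0

    def pulse(self):
        # Energize the coil on the next arm in sequence: one step.
        self.angle = (self.angle + self.pitch) % 360.0
        return self.angle

rotor = ElectropermanentRotor(num_arms=4)
for _ in range(3):
    rotor.pulse()
print(rotor.angle)  # → 270.0 (three 90° steps; holds here with power off)
```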
Exploded view of the MIT milli-motein
When the milli-motein modules are linked in a chain, each module gets an instruction to turn left, right or straight. It works like DNA code in that, like a DNA pair, it carries out an independent function without any reference to the task as a whole. However, the team learned that using the modules in chains rather than as discrete units allowed for much more efficient control since commands can be passed down the line so that the individual “dumb” modules operate in harmony.
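The chain-relay scheme can be sketched as a head module keeping the first instruction and forwarding the rest down the line, so that each "dumb" module acts locally without any view of the overall task. The function below illustrates the idea; it is not the real milli-motein protocol:

```python
def route_commands(commands):
    """Distribute a command list down a chain of modules: each module
    keeps the first remaining instruction and forwards the rest, so no
    module needs knowledge of the whole task. A sketch of the
    chain-relay idea, not the milli-motein's actual protocol."""
    assignments = []
    remaining = list(commands)
    module = 0
    while remaining:
        assignments.append((module, remaining.pop(0)))  # take own command
        module += 1                                     # rest goes onward
    return assignments

print(route_commands(["left", "straight", "right"]))
# → [(0, 'left'), (1, 'straight'), (2, 'right')]
```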
The final size of later versions of the milli-motein robot will depend on the size of the task and chains can range from the size of proteins to that of a human being. Even if the robot concept goes nowhere, the new motor has already sparked interest in a number of industries.
The MIT video below explains the mechanics of the milli-motein robot.
Source: MIT, Gizmag

MIT spin-off Robot Rebuilt working on sensitive robotic hands


While at MIT, Robot Rebuilt's founder Torres-Jara built a robot with highly sensitive hands, called Obrero

Robot manipulators – or hands, as we like to call them – come in all shapes and sizes. Some, like those developed for Willow Garage's PR2, have just two fingers. Others have three, four, or five fingers – and some manage to lift objects with none at all. Now, an MIT spin-off called Robot Rebuilt is hitting up Boston venture capital firms to develop a manipulator with human-like sensitivity.
"I was inspired by the ridges humans have on their fingers. We wanted to make a robotic hand that would mimic that, and achieve some of the same sensitivity our hands have," says Eduardo Torres-Jara, who developed the robot hand at MIT's Computer Science and Artificial Intelligence Lab under the supervision of Rodney Brooks (founder of iRobot and Rethink Robotics). Torres-Java's company is working out a technology license with MIT, while he works as an assistant professor at Worcester Polytechnic Institute.
Torres-Jara's Obrero robot, seen here about to lift a ceramic mug
At MIT he built a robot called Obrero, which had a compliant hand with position and force control of its three fingers, allowing them to conform to an object's shape. Additionally, the fingers and palm contained eight force sensors, five position sensors, and seven high-resolution tactile sensors – the details of which have since been removed from Obrero's project page. However, Obrero is described as being "as much about perception as action and is intrinsically responsive to the properties of the object being manipulated; manipulation that does not rely on vision as the main sensor but as a complement."
Robot Rebuilt's new robot, Tactico, would continue where Obrero left off. "We're starting with the hand, but we're also working on building an arm," Torres-Jara says in an interview with Boston.com. "Right now, PhDs come in at 3 AM to take care of their experiments," he adds, referencing the CNC-milled parts that must be removed from the machine's work area. "Our robot could do that." It would use a camera to detect objects around it, and would pick them up with enough strength to lift them without them slipping or breaking.
You can watch Obrero handling objects in the videos below.
Source: Robot Rebuilt via Boston.com, Gizmag

Disney Research robot can juggle, play catch


The robot created at Disney Research can play catch with a human partner

With the aim of providing some physical interaction between entertainment robots and guests at its theme parks, while still maintaining a safe distance between the two, Disney Research has created an animatronic robot that can play catch and juggle balls with a human partner.
A Kalman filter algorithm is used to analyze video captured by an external camera system, consisting of a Kinect-like ASUS Xtion PRO LIVE, to track a colored ball in three-dimensional space and predict its destination and timing. The ball's predicted location is then relayed to the robot, which moves its hand accordingly.
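The tracking step can be illustrated with a textbook one-dimensional constant-velocity Kalman filter, where the state is position and velocity and only position is measured. This is a generic sketch of the filtering idea with made-up noise parameters, not Disney Research's implementation:

```python
import numpy as np

def kalman_step(x, P, z, dt, q=1e-3, r=1e-2):
    """One predict/update cycle of a 1D constant-velocity Kalman filter.
    State x = [position, velocity]; z is a position-only measurement.
    A textbook sketch, not Disney Research's tracker."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    H = np.array([[1.0, 0.0]])              # we only observe position
    Q = q * np.eye(2)                       # process noise (assumed)
    R = np.array([[r]])                     # measurement noise (assumed)
    # Predict forward one time step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new measurement
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a ball moving at a constant 2 m/s, sampled every 0.1 s
x, P = np.array([0.0, 0.0]), np.eye(2)
for k in range(1, 20):
    z = np.array([2.0 * k * 0.1])           # noiseless position measurements
    x, P = kalman_step(x, P, z, dt=0.1)
print(round(x[1], 2))  # velocity estimate approaches the true 2.0 m/s
```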
The balls need to be thrown in the general vicinity of the robot’s hand, which has been modified with a cup-like shape to boost its catching ability. The camera system also tracks the human thrower’s head to orient the robot towards them, and the robot’s head moves to give the impression that it is tracking the ball through the air with its eyes.
View from the external camera system
Caught balls are thrown back 2.5 meters (8 ft) to the thrower, while the developers have given the robot several different animations that play out when it drops a ball. These include a shaking of the head, looking behind, looking down, or a shrug of the shoulders.
The developers were also able to speed up the throw/catch cycle to give the robot the ability to juggle three balls with a human partner. While not quite on a par with the three-fingered robotic hand developed at the Ishikawa Komuro laboratory at the University of Tokyo, it's still pretty impressive.
There’s no word on when visitors to Disney theme parks can expect to enjoy a game of catch with a robot, but we don't imagine the technology will be used in Disney World’s Hall of Presidents.
The video below shows the Disney Research robot's catching and juggling ability.

iRobot's new hand can take a beating from a baseball bat


iRobot tests the durability of its new robot hand by smashing it with a baseball bat

Not even a baseball bat can damage the fingers on a new robotic hand developed by iRobot for the DARPA Autonomous Robotic Manipulation (ARM) program. The four-year program, which began in 2010, seeks to build and program a robot capable of handling all kinds of things on the battlefield with minimal human input. Most robot hands have rigid components which tend to be quite fragile, but this hand has rubbery fingers, which are better able to absorb impacts.
Its fingers may look a bit flimsy, but the team at iRobot is turning that into an advantage. Rather than having to be super precise, the compliant nature of the fingers allows the robot to drag tiny objects off of smooth surfaces, like a key on a table, with relative ease. And the hand can carry a payload of up to 50 pounds (22.7 kg), which is a significant improvement over NASA's Robonaut R2, which can handle less than half that.
Those features may come in handy (no pun intended) as the ARM program moves into its testing phase, which will require the robot to write with a pen, use pliers, unzip a duffel bag, drill a hole with a power tool, insert a key to unlock a door, and assemble an object from a parts kit. One of the tests involves unpinning a grenade and throwing it. Clearly, this robot is destined for war, but it could help to save soldiers' lives.
Watch as the hand's fingers are repeatedly smashed with a baseball bat in the following video.

The Social Drink Machine takes your order via Facebook and Twitter


The Social Drink Machine is a robotic bartender that takes your order via Facebook or Twitter

Robofun, which bills itself as the largest open-source hardware store in Romania, has built a robotic bartender called The Social Drink Machine. It takes its inspiration from another recently created "botender," The Inebriator, which the team at Robofun felt could be improved with a social media interface. They built their own robot from scratch in just 10 days and added Facebook and Twitter apps that let you order drinks from a mobile phone.
All you have to do is scan a QR code near the robot, and you'll be whisked to a Facebook app that lets you choose a drink. If you'd prefer, you can tweet "gimme drinks @socialdrinkbot" to access the Twitter app. Then you plop down your glass on a holding tray, and the Arduino-powered robot does the rest. An even earlier robotic bartender, built at the University of Saarland in 2006, could take your order using speech recognition.
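A toy version of the tweet-handling step might look like the following. The menu, handle check, and matching rule are all assumptions for illustration, not Robofun's actual app:

```python
import re

MENU = {"mojito", "cola", "whiskey sour"}   # hypothetical drink menu

def parse_order(tweet):
    """Check whether a tweet is addressed to the bot and extract a
    recognized drink name (case-insensitive). A toy parser for
    illustration, not Robofun's real Twitter app."""
    text = tweet.lower()
    if "@socialdrinkbot" not in text:
        return None                          # not addressed to the bot
    for drink in MENU:
        if re.search(r"\b" + re.escape(drink) + r"\b", text):
            return drink                     # first menu item mentioned
    return None                              # addressed, but no valid drink

print(parse_order("gimme drinks @SocialDrinkBot - one mojito please"))  # → mojito
print(parse_order("nice weather today"))                                # → None
```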
Sure, it's a bit too slow to replace human bartenders, but that's not really the point. The team created the robot to demonstrate what they can do. According to Viorel Spinu, the CEO and founder of Robofun, the Social Drink Machine is being leased for corporate events, and they're building an enhanced version for a local beverage brand. They have no plans to market it. You can watch it doing its thing in the following video.
Source: Robofun Create via Gizmodo

Teaching robots new tricks without programming


A test subject teaches the PR2 robot how to fold a t-shirt through demonstration

Don't believe what the sci-fi movies tell you. When it comes to understanding our world, robots are stupid. Like computers, robots only do what we program them to do. And that's a big problem if we're ever going to realize the dream of practical robot helpers for the masses. Wouldn't it be great if anyone could teach a robot to perform a task, like they would a child? Well, that's precisely what Maya Cakmak has been working on at Willow Garage.
Cakmak, a researcher from Georgia Tech, spent the summer creating a user-friendly system that teaches the PR2 robot simple tasks. The kicker is that it doesn't require any traditional programming skills whatsoever – it works by physically guiding the robot's arms while giving it verbal commands.
After inviting regular people to give it a try, she found that with few instructions they were able to teach the PR2 how to retrieve medicine from a cabinet and fold a t-shirt. Such tasks may be easy for us, but for a robot they are very difficult. That's why most scientists don't take the threat of a robopocalypse very seriously just yet – they know how difficult it is to get a robot to do anything even remotely useful.
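The guide-and-speak teaching loop can be sketched as keyframe recording: each verbal cue stores the arm's current pose, and replay walks the saved poses in order. The class below is a bare-bones illustration of that idea, with made-up command strings and joint values, not Willow Garage's actual system:

```python
class DemonstrationRecorder:
    """Record arm poses as keyframes on a verbal cue, then replay them
    in order. A bare-bones sketch of keyframe-based programming by
    demonstration, not Willow Garage's PR2 software."""

    def __init__(self):
        self.keyframes = []

    def on_command(self, command, current_pose):
        # Only the recognized cue stores a keyframe; anything else is ignored.
        if command == "save pose":
            self.keyframes.append(list(current_pose))

    def replay(self):
        # A real robot would interpolate between keyframes; here we
        # just return the recorded sequence of joint angles.
        return self.keyframes

rec = DemonstrationRecorder()
rec.on_command("save pose", [0.0, 1.2, -0.5])   # guide the arm, say "save pose"
rec.on_command("ignore me", [9.9, 9.9, 9.9])    # unrecognized speech: no-op
rec.on_command("save pose", [0.3, 0.8, -0.1])
print(len(rec.replay()))  # → 2
```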
Test subjects were provided instructions on how to teach the robot similar to what you'd expect when buying a sophisticated appliance
Teaching by demonstration isn't going to replace traditional programming, because robots will still require some degree of common sense to function properly in our uncertain world.
For example, Rethink Robotics' new industrial robot, Baxter, uses a combination of the two. This allows anyone to quickly and easily program the robot to perform manipulation tasks on a production line, like picking up an object over here and moving it over there. This is made possible in part because Baxter uses its own artificial intelligence too, like image processing software, to pinpoint the exact positions of widgets placed randomly in front of it.
The potential for Cakmak's system would multiply as data is shared across a network of robots. In theory, you could eventually have software routines capable of folding any type of clothing, or loading any type of cookware into a dishwasher. However, some local instruction would still be required as each household is unique and no amount of advance programming can account for all the little differences in layout (the precise location of a sock drawer, for example).
But while we wait for robots to gain the artificial intelligence to carry out various household chores by themselves with nothing more than a verbal command, Cakmak's approach could help robots become truly flexible household helpers.
Check out version 1.0 of Cakmak's system in the video below.
Source: Willow Garage, Gizmag

Vanderbilt University steps into the exoskeleton market

Test subject Brian Shaffer uses the prototype exoskeleton

For people who are unable to walk under their own power, exoskeletons offer what is perhaps the next-best thing. Essentially “wearable robots,” the devices not only let their users stand, but they also move their legs for them, allowing them to walk. While groups such as Berkeley Bionics, NASA, Rex Bionics, and ReWalk are all working on systems, Nashville’s Vanderbilt University has just announced the development of its own exoskeleton. It is claimed to offer some important advantages over its competitors.
The Vanderbilt device attaches to the user’s torso, legs and feet. Computer-controlled electric motors in its hip and knee joints are activated by the wearer, bending their legs for them and moving them forward. As with other exoskeletons for the physically-challenged, the wearer still needs to use crutches or a walker in order to keep their balance.
Vanderbilt University has developed an exoskeleton for the physically challenged
As with a Segway, if the user leans forward, the machine will begin to walk them in that direction. Leaning backward and holding that position for a few seconds causes it to bring them to a seated position. If they subsequently sit forward for several seconds, it will bring them back up to a standing position.
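The lean-driven mode switching described above maps naturally onto a small state machine. The state names and the hold-time threshold below are assumptions for illustration, not Vanderbilt's control code:

```python
def next_state(state, lean, held_seconds):
    """Lean-driven mode switching, as described above. `lean` is
    'forward', 'backward', or 'neutral'; `held_seconds` is how long the
    lean has been held. A simplified state machine with an assumed
    3-second hold threshold, not Vanderbilt's controller."""
    if state == "standing":
        if lean == "forward":
            return "walking"                 # lean forward → start walking
        if lean == "backward" and held_seconds >= 3:
            return "sitting"                 # held backward lean → sit down
    elif state == "walking" and lean == "neutral":
        return "standing"                    # straighten up → stop
    elif state == "sitting" and lean == "forward" and held_seconds >= 3:
        return "standing"                    # held forward lean → stand up
    return state                             # otherwise, no change

print(next_state("standing", "forward", 0))    # → walking
print(next_state("standing", "backward", 4))   # → sitting
print(next_state("sitting", "forward", 4))     # → standing
```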
According to the university, its exoskeleton is much sleeker and less obtrusive than other models – at a weight of 27 pounds (12.25 kg), it’s also said to be considerably lighter. This means that users could carry it packed and folded on the back of their wheelchair, then unpack it and strap it on (without any assistance) when they wished to use it. They could also wear it when seated in their wheelchair, if they anticipated going back and forth between the two.
Additionally, the amount of assistance provided by the motors automatically adapts to the amount of muscle control the user still has in their legs – this allows them to maintain a degree of physical fitness. Other products reportedly just provide “all the power all the time.”
It is also said to be the only exoskeleton that utilizes functional electrical stimulation. This is a rehabilitative technique in which paralyzed muscles are made to contract and relax via the application of small electrical pulses. It can reportedly build leg strength in people with incomplete paraplegia, while improving circulation, changing bone density and reducing muscle atrophy in complete paraplegics.
The Vanderbilt exoskeleton attaches to the user’s torso, legs and feet
The exoskeleton was developed by a team led by Prof. Michael Goldfarb, who previously developed a bionic leg prosthesis that anticipates its user’s moves. Vanderbilt has licensed the exoskeleton technology to Parker Hannifin Corporation, which hopes to have a commercial version ready to go by 2014. A price estimate hasn’t been announced, although it is hoped to be significantly less expensive than other devices.
The prototype can be seen in use in the video below.


Source: Vanderbilt University, Gizmag

Korea shows off salad-tossing robot at Robot World 2012

CIROS intelligently slices a cucumber with a kitchen knife

Researchers from the Korean Institute of Science and Technology's (KIST) Center for Intelligent Robotics (CIR) demonstrated their household service robot, CIROS, at Robot World 2012. CIROS, the third version of the robot since development began in 2005, is intended to help out around the home by performing simple chores. You can watch it prepare a salad by slicing a cucumber and adding dressing in the video below.
According to a KIST official, CIROS is able to recognize common objects as well as kitchen appliances like microwaves, sinks, refrigerators, and dishwashers, and can move intelligently through its environment. The robot's artificial intelligence is the result of collaboration between robotics labs at several top-ranking Korean institutions including Seoul National University, the Korean Advanced Institute of Science and Technology (KAIST), Korea University, Sungkyunkwan University, Sogang University, and the Pohang University of Science and Technology (POSTECH). As such, CIROS represents the latest in Korean robotics technology.
The robot's head contains stereoscopic cameras and a 3D IR sensor similar to the Microsoft Kinect, which it uses to recognize objects. Furthermore, robust speech recognition is made possible with a 12-piece microphone array. CIROS stands 5'3" (160 cm) tall, weighs a hefty 330 pounds (150 kg), and moves in any direction thanks to its wheeled base. It can detect and avoid obstacles in its vicinity thanks to a pair of laser range finders and six ultrasonic sensors in its body. And its dexterous hands, identical to those developed for HUBO (another robot developed separately at KAIST), can hold a variety of objects and tools.
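A minimal sketch of how readings from the laser range finders and ultrasonic sensors might be fused into a single obstacle flag; the threshold and the any-sensor rule are assumptions for illustration, not KIST's navigation stack:

```python
def obstacle_near(laser_ranges, sonar_ranges, threshold=0.5):
    """Flag an obstacle if any laser or ultrasonic reading falls below
    the threshold distance (meters). A toy sensor-fusion rule for
    illustration, not CIROS's actual navigation code."""
    readings = list(laser_ranges) + list(sonar_ranges)
    return min(readings) < threshold

print(obstacle_near([1.2, 0.9], [0.4, 2.0]))  # sonar sees 0.4 m → True
print(obstacle_near([1.2, 0.9], [0.8, 2.0]))  # everything clear → False
```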
CIROS loads a dishwasher
Earlier versions of the robot poured beverages from juice dispensers, and delivered them on a serving tray. Photos from the lab suggest CIROS is also capable of intelligently loading and unloading a dishwasher. Eventually, the researchers plan to build and program a robot that can perform every step of serving a meal, from its preparation through to tidying up. That won't be a reality for several years, but progress is being made slowly but surely. Similar projects are also underway in the United States, Germany, and Japan, and researchers compete in RoboCup @Home, an annual competition to build household robots that can perform simple tasks like serving a bowl of cereal.
Besides CIROS, the researchers at KIST CIR are also developing a bipedal humanoid robot named KIBO, and educational robots designed to teach students English lessons. A separate lab at KIST also works on bipedal robots with technology giant Samsung.
Robot World 2012 ran from October 25 to 28, and attracted many businesses to display their industrial manufacturing robots, robot vacuum cleaners, commercial service robots, and educational toys and kits.
Source: KIST CIR (Korean) via Robot World 2012 (Korean)

Virginia Tech's CHARLI-2 robot dances Gangnam Style

Researchers at Virginia Tech had some fun with their RoboCup soccer champ

Just in case you haven't had your fill of PSY's viral K-POP sensation, the researchers at Virginia Tech's Robotics and Mechanisms Laboratory (RoMeLa) have put out a new video of their robot dancing Gangnam Style. While the robot named CHARLI-2 doesn't display any fancy footwork in the video, some of its walking and balancing technology is being implemented in the Navy's Autonomous Shipboard Humanoid (ASH).
The team at RoMeLa, led by Dr. Dennis Hong, has already developed a pair of legs based on CHARLI-2's lower half, called SAFFiR (Shipboard Autonomous Firefighting Robot), which will have to navigate tight corridors and smoky environments later this year.
The Office of Naval Research (ONR) has been working on a firefighting robot called Octavia since 2010, but that robot is rather bulky and moves on wheels, which limits where it can go. Eventually the Navy would like to combine aspects of both projects into ASH, which will wear a protective suit to prevent it from overheating.
CHARLI-2 spends most of its time advancing artificial intelligence in RoboCup soccer matches; in 2011 and 2012 it came out on top in the league's adult-size division, which typically involves a kicker and a goalie. It sports 25 degrees of freedom, stands 4 feet 7 inches (141 cm) tall, and is designed to be ultra-lightweight, weighing less than 33 pounds (15 kg). It can currently walk at a top speed of about 0.86 miles per hour (1.4 km/h).
CHARLI-2's PSY-inspired dance moves can be seen in the video below.
Source: RoMeLa website via YouTube, Gizmag

Tightrope walking bipedal robot

Dr. Guero's Primer-V4 robot works its way across the tightrope

A Japanese roboticist who goes by the handle Dr. Guero, famous for programming his hobby robot to ride a miniature bicycle and walk on stilts, has managed to get his robot to balance on a tightrope. His Primer-V4 robot is based on the Kondo KHR-3HV hobby kit (which can be purchased for around US$1,800), but features a few modifications that give it the ability to inch its way along a steel wire just over an eighth of an inch (4 mm) thick.
The following video shows Dr. Guero's robot as it slides its way along the wire, making minute adjustments to its balance by waving its arms. On his website, Dr. Guero explains that the robot's arms move in different directions based on signals from its inclination sensor. He had to modify the feet by adding grooves to help catch the wire – which seems like a fair modification given that human performers will use their toes to do the same thing. He also replaced the standard arms with ones that have fewer servos and parts to provide better balance.
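The arm-balancing behavior can be sketched as a proportional rule: the tilt reported by the inclination sensor maps to a counter-rotating arm angle, saturated at a mechanical limit. The gain and limit below are made-up values for illustration, not Dr. Guero's controller:

```python
def arm_response(tilt_deg, gain=2.0, limit=60.0):
    """Map a measured body tilt to a counter-rotating arm angle: tipping
    one way swings the arms the other way to generate a correcting
    moment, clamped at a mechanical limit. A toy proportional rule with
    assumed gain/limit, not Dr. Guero's actual controller."""
    correction = -gain * tilt_deg                 # oppose the tilt
    return max(-limit, min(limit, correction))    # respect servo limits

print(arm_response(5.0))    # tipping 5° one way → arms swing -10.0
print(arm_response(-40.0))  # large tilt: response saturates at 60.0
```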
Dr. Guero (aka Masahiko Yamaguchi) has worked at some prestigious labs, including Japan's National Institute of Advanced Industrial Science and Technology, Boston Dynamics, and Osaka University. His nickname, chosen by his wife, comes from the long-running comic and animated television show Dragon Ball Z, whose Dr. Gero is an evil scientist behind the Red Ribbon Androids. You can watch the tightrope trick below, and check out his earlier projects on his website.
Source: Dr. Guero's website (Japanese), Gizmag
