For decades, academic and industry researchers have been working on control algorithms for autonomous helicopters — robotic helicopters that pilot themselves, rather than requiring remote human guidance. Dozens of research teams have competed in a series of autonomous-helicopter challenges posed by the Association for Unmanned Vehicle Systems International (AUVSI); progress has been so rapid that the last two challenges have involved indoor navigation without the use of GPS.

But MIT’s Robust Robotics Group — which fielded the team that won the last AUVSI contest — has set itself an even tougher challenge: developing autonomous-control algorithms for the indoor flight of GPS-denied airplanes. At the 2011 International Conference on Robotics and Automation (ICRA), a team of researchers from the group described an algorithm for calculating a plane’s trajectory; in 2012, at the same conference, they presented an algorithm for determining its “state” — its location, physical orientation, velocity and acceleration. Now, the MIT researchers have completed a series of flight tests in which an autonomous robotic plane running their state-estimation algorithm successfully threaded its way among pillars in the parking garage under MIT’s Stata Center.
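To give a flavor of what "state estimation" means here, the sketch below is a minimal one-dimensional Kalman filter that predicts position and velocity from an accelerometer reading and corrects with a range measurement. It is an illustration only, not the team's algorithm (those details are in the ICRA papers); the update rate and noise values are invented for the demo.

```python
import numpy as np

# Hypothetical 1-D constant-velocity Kalman filter: position/velocity state,
# predicted from an accelerometer reading, corrected by a range measurement.
# Illustrative only -- the MIT system fuses LIDAR, gyros, and accelerometers
# over a far richer aircraft model.

dt = 0.02                      # 50 Hz loop, an assumed rate
F = np.array([[1.0, dt],       # state transition: position += velocity * dt
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # control input: accelerometer
              [dt]])
H = np.array([[1.0, 0.0]])     # we measure position only (e.g. a laser range)
Q = np.eye(2) * 1e-4           # process noise (tuning assumption)
R = np.array([[0.05]])         # measurement noise (tuning assumption)

x = np.zeros((2, 1))           # initial state: position 0, velocity 0
P = np.eye(2)                  # initial covariance

def kf_step(x, P, accel, z):
    """One predict/update cycle: IMU-driven prediction, range correction."""
    # Predict, treating the accelerometer as a control input.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Update with the position measurement z.
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a vehicle accelerating at 1 m/s^2 with noisy range readings.
rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 0.0
for _ in range(500):
    true_vel += 1.0 * dt
    true_pos += true_vel * dt
    measurement = true_pos + rng.normal(0.0, 0.2)
    x, P = kf_step(x, P, 1.0, np.array([[measurement]]))

print(round(float(x[0, 0]), 1), round(true_pos, 1))  # estimate tracks truth
```

Even this toy version shows the essential loop the researchers run at much higher fidelity: dead-reckon from inertial sensors, then pull the estimate back toward an absolute measurement.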

Additional footage courtesy of: Adam Bry, Nicholas Roy, Abraham Bachrach of the Robust Robotics Group, Computer Science and Artificial Intelligence Laboratory, Department of Aeronautics and Astronautics at Massachusetts Institute of Technology.

Special thanks to the Office of Naval Research under MURI N00014-09-1-1052 and the Army Research Office under the Micro Autonomous System Technologies program.


RONI RENGITA: Awesumm…. how do you combine camera info with the laser unit to know its localization with respect to the environment?

GEEKEA.COM: …and then SkyNet! hehe… great!

Felar Furlong: Well done.

Too bad for humanity, John Connor isn't born yet.

tapiwa112: I think the camera is just for observation; the range finder and the inertial measurement unit are gathering all the data.

MusicMusicMusic: When is Google going to use these to do street-map fly-bys? 2013?

Alperen Genc: All cool things come from MIT.

MrJoe64: Well done!

Quantum Break: I thought at first that it was controlled, but it's not, which is something epic 😀

Lexoka: Given how thin the Martian atmosphere is, that might be tricky.

lolealajaggress: So very cool!

deekal_el: Quick, before Skynet arises…

Daniel Hou: So, how does it land on the floor?

Pratik Revankar: Wow, that was so very cool.

Ryukachoo: PLEASE RELEASE THE CODE OPEN SOURCE! I WANT TO EXPAND ON THIS SO BADLY

GoldPiles: My investment trading system works very similarly to this… but I designed my trading system a few years back.

Jay Loz: There was no traveling forward in time in Terminator.

Amol Khade: With all due respect, I didn't find it that great at all… It's not such a complex algorithm either; I too belong to the AI (artificial intelligence) and NN (neural network) field. I expected a lot more from MIT. 🙂 Cheers!

Sierra: Holy shit. Robots FTW.

Ravi M. Fernando: You mean you want to cut and paste the code and see if you can be on par with MIT engineers. The guy pretty much told you how to do it in the video.

Ryukachoo: You bet. Bootstrap like crazy.

What he said in the video was an extremely high-level description of something much more finicky, mostly the state estimation. Marrying the LIDAR with accelerometers and gyroscopes sounds like a huge pain; however, a system like this would be a huge boon to something like APM, the Arduino-based open-source autopilot project.

Fotis Dimeas: Marrying LIDAR, accelerometers, and gyroscopes is already implemented in ROS. Look for laser_scan_matcher.

Ryukachoo: I've been meaning to look at ROS; I didn't realize they already had features like that. That's only half the story, though: the other half is the state estimation using that simplified aerodynamics model, although that might be possible using a rudimentary physics engine for games.
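The gyro/accelerometer "marriage" the thread keeps coming back to can be demonstrated with a complementary filter, the simplest form of that fusion. This is a hypothetical toy for a single pitch angle (the gain, loop rate, and noise values are invented), far simpler than the LIDAR/IMU fusion in the actual system: the gyro integrates smoothly but drifts, the accelerometer gives a noisy but drift-free absolute angle, and blending them gets the best of both.

```python
import random

# Hypothetical complementary filter for a pitch angle. The gyro rate
# integrates cleanly but carries a drift bias; the accelerometer-derived
# angle is noisy but unbiased. Blending the two yields an estimate that is
# smooth in the short term and drift-free in the long term.

ALPHA = 0.98   # trust the gyro 98% per step (a common rule-of-thumb gain)
DT = 0.01      # 100 Hz loop, an assumed rate

def complementary_step(angle, gyro_rate, accel_angle):
    """Blend the integrated gyro rate with the accelerometer's absolute angle."""
    return ALPHA * (angle + gyro_rate * DT) + (1.0 - ALPHA) * accel_angle

random.seed(1)
true_angle = 0.3               # radians, held constant for the demo
angle = 0.0                    # filter starts cold
for _ in range(2000):
    gyro = 0.0 + 0.02          # true rate is zero, plus a constant drift bias
    accel = true_angle + random.gauss(0.0, 0.05)   # noisy absolute reading
    angle = complementary_step(angle, gyro, accel)

# The estimate settles near the true angle despite gyro bias and accel noise.
print(round(angle, 2))
```

A Kalman filter does the same blending with a statistically optimal, time-varying gain instead of the fixed `ALPHA`, which is closer to what packages like laser_scan_matcher and the MIT estimator actually use.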

Tbi Robotics Ltd: Very good!

Julian St Amand: I just hope MIT won't strap a bomb to it and let it fly away.

SW7: This is so fucking cool! Greetings from an Imperial College London Aeronautics student!

Игорь Василенко: Awesome!

hottor godwin: Cool

KIRAN KULKARNI: Flight without GPS is amazing… inside a parking garage… great work, guys!

prajñā prajñā: Fermat's Last Theorem is the transformer bumblebee Robot.

Original equation:

z^n=x^n+y^n.

Mean:

z^(n-3)*z^3=x^(n-3)*x^3+y^(n-3)*y^3.

Using the formula z^3=[z(z+1)/2]^2 – [z(z-1)/2]^2 to convert z^3 into exponent 2. Then using the formula [z(z+1)/2]^2=1^3+2^3+……..+z^3 to convert exponent 2 back into exponent 3.

Repeated several times with the same method.

The transformer bumblebee Robot was created according to your own structure.

Simplest format about Transformer Bumblebee Robot.

Using two formulas:

z^3=[z(z+1)/2]^2 – [z(z-1)/2]^2

And define x<x+a<y.

x^3+y^3=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+a-1)^3+(x+a)^3+(x+a+1)^3+……..+(y-1)^3]

Because:

(x+a)^3= [(x+a)(x+a+1)/2]^2 – [(x+a)(x+a-1)/2]^2

So also:

x^3+y^3=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+a)(x+a+1)/2]^2 + [(x+a)(x+a-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+a-1)^3+(x+a+1)^3+……..+(y-1)^3]

Original equation:

z^3=x^3+y^3.

According to the above method, the transformer bumblebee Robot system was created:

[z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+a)(x+a+1)/2]^2 + [(x+a)(x+a-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+a-1)^3+(x+a+1)^3+……..+(y-1)^3]

[z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+b)(x+b+1)/2]^2 + [(x+b)(x+b-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+b-1)^3+(x+b+1)^3+……..+(y-1)^3]

[z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+c)(x+c+1)/2]^2 + [(x+c)(x+c-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+c-1)^3+(x+c+1)^3+……..+(y-1)^3]

[z(z+1)/2]^2 – [z(z-1)/2]^2=[y(y+1)/2]^2 – [x(x-1)/2]^2 – [(x+d)(x+d+1)/2]^2 + [(x+d)(x+d-1)/2]^2 – [(x+1)^3+(x+2)^3+……..+(x+d-1)^3+(x+d+1)^3+……..+(y-1)^3].

……..

Flood robots on the planet in the Orion galaxy are looking for integers to eat.

Certainly there are not enough integers for the large number of transformer bumblebee Robots.

ADIEU.
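For what it's worth, the two identities the comment above builds on are genuine standard results: z^3 is the difference of consecutive squared triangular numbers, and a squared triangular number is the sum of the first z cubes (Nicomachus's theorem). A quick numeric check of just those identities (which of course says nothing about Fermat's Last Theorem):

```python
# Verify the two identities the comment relies on; they are standard results
# and do not, of course, prove anything about Fermat's Last Theorem.

def tri(n):
    """n-th triangular number."""
    return n * (n + 1) // 2

for z in range(1, 200):
    # z^3 = T(z)^2 - T(z-1)^2  (difference of squared triangular numbers)
    assert z**3 == tri(z)**2 - tri(z - 1)**2
    # T(z)^2 = 1^3 + 2^3 + ... + z^3  (Nicomachus's theorem)
    assert tri(z)**2 == sum(k**3 for k in range(1, z + 1))

print("both identities hold for z = 1..199")
```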

Scott Elder: Very cool – good work, guys!

Oehcs Inc.: This is only possible with today's fast processors, lightweight lasers, and advanced batteries. Only more proof that self-driving cars are in the future.

EpicAerial: Very impressive flight platform. I wonder if takeoff and landing are fully autonomous as well?

ncdefenceman: This is one of the coolest things I've seen from MITnews. I'm going to look into building one of these, but with some modifications!

AVKtt: Forgotten but still relevant.

Jacob David C. Cunningham: That's amazing. I wonder if DARPA employed your team.

Spooks: This is incredible.

Cesar Melendez: This would be cool to buy.

Vladimir Ashkanikov: Is it possible to share the source code?

Robert Lopez: Can the problem of dynamic navigation be fixed by adding some kind of motor that gives the tail more pivoting motion, or would that make the plane lose balance?

Janne K.: Impressive