
Optical flow


We have recently released a few products with optical flow sensors (the Flow deck and the Flow Breakout board) without really talking about the concept of optical flow, so we thought we would dedicate this week's post to it.

The most common example of optical flow is probably a computer mouse. Turning the mouse over you'll see a strong light that illuminates the surface so that a camera can see it clearly. When running, the camera identifies features on the surface below it and tracks their motion between frames. As you move the mouse to the left, the features move to the right.

In the example below you can see a feature being tracked over time.

optical flow

The feature is tracked from frame to frame and the output is the distance that the feature moved since the previous frame. 

The functionality of an optical flow sensor of course depends on being able to find features to track: a very uniform surface is hard to track since all the frames look the same. If you've ever tried using a mouse on a glass table or a reflective surface, you've probably seen that it doesn't work.

The same concept is used in our Flow products. It also happens that the manufacturer of the chip we use, PixArt, is a world leader in optical mouse sensors. They have applied the same concepts as for the mouse, but with a different lens that lets the camera track features further away (80 mm to infinity). Like the mouse, this depends on finding features to track, which might be problematic on poorly lit surfaces or on surfaces that are very uniformly colored. On the other hand, if the area is strongly lit from above when you fly, you might start tracking your own shadow on the floor.

One of the issues with using optical tracking from a flying platform is that you need to know the distance to the features. In the case of the mouse, the features are known to be right under the sensor, but on a flying platform you cannot tell this from the image alone. Think about sitting in a plane and watching the ground move: it looks really slow, yet your speed over the ground is actually very high. For our Flow products we have therefore added the VL53L0x ToF distance sensor to measure the distance to the surface being tracked. This completes the equation, so being further away from the tracked features is taken into account. Note that the accuracy of the tracking decreases as the distance increases, since the difference between frames becomes smaller and harder to detect.
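
To make the scaling concrete, here is a small hypothetical Python sketch (not the Flow deck firmware; the focal length value is made up) of how the measured flow and the measured height combine into a ground velocity estimate:

```python
# Hypothetical illustration, not the Flow deck firmware: converting the
# raw pixel flow into a ground velocity requires the measured height.
def ground_velocity(flow_px_per_s, height_m, focal_length_px=350.0):
    """Scale optical flow (pixels/s) into ground velocity (m/s).

    focal_length_px is a made-up placeholder; the real value depends on
    the PMW3901 optics and resolution.
    """
    # Pinhole / small-angle model: velocity = flow * height / focal_length
    return flow_px_per_s * height_m / focal_length_px

# The same measured flow means very different speeds at 0.1 m and at 2 m:
print(ground_velocity(100.0, 0.1))  # ~0.03 m/s
print(ground_velocity(100.0, 2.0))  # ~0.57 m/s
```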

An optical flow sensor can also be used to track the motion of other objects. If the sensor is fixed and pointing sideways, it will detect objects passing in front of it, which could for instance be used to count people passing through a doorway, or as a touchless mouse.

 


A Flying Gripper based on Modular Robots


Grasping objects is a hard task that usually implies adding a dedicated mechanism (e.g. an arm or a gripper) to the robot. Instead of adding extra components, have you thought about embedding the grasping capability into the robot itself? And have you thought about whether we could do it while flying?

In the GRASP Laboratory at the University of Pennsylvania, we are concerned about controlling robots to perform useful tasks. In this work, we present the Flying Gripper! It is a novel flying modular platform capable of grasping and transporting objects with the help of multiple quadrotors (crazyflies) working together. This research project is coordinated by Professor Mark Yim and Professor Vijay Kumar, and led by Bruno Gabrich (PhD candidate) and David Saldaña (Postdoctoral researcher).

 

 

Inspiration in Nature

In nature, cooperative work allows small insects to manipulate and transport objects that are often heavier than the individuals themselves. Unlike collaboration on the ground, collaboration in the air is more complex, especially considering flight stability. With this inspiration, we developed a platform composed of four identical cooperative modules, each based on a quadrotor (Crazyflie) within a cuboid frame with a docking mechanism. Pairs of modules are able to fly independently and physically connect by matching their vertical edges, forming a hinge. Four one-degree-of-freedom (DOF) connections result in a one-DOF four-bar linkage that can be used to grasp external objects. With this platform we are able to change the shape of the flying vehicle during flight and use its own body to constrain and grasp an object.

Flying Gripper Design and Motion

In the proposed modular platform we use the Crazyflie 2.0. Its battery lasts around seven minutes, though in our case flight time is reduced due to the extra weight. The motor mounting was modified from the standard design: we tilted the rotors 15 degrees. This was necessary because more yaw authority was required to enable grasping as a four-bar. However, adding this tilt reduces the lifting thrust by 3%. Axially aligned cylindrical Neodymium Iron Boron (NdFeB) magnets, 1/8″ in diameter and 1/4″ thick, are mounted on each corner, enabling edge-to-edge connections. Each cylindrical magnet has a mass of 0.377 g and is able to generate a force of 0.4 kg in a tangential connection between two of the same magnets. This forms a strong bond when two modules connect in flight. Note that the connections are not rigid – each forms a one-DOF hinge.

The four attached modules result in a one-DOF four-bar linkage, in addition to the combined position and attitude of the conglomerate. The internal angles of the four-bar are controlled through the yaw attitude of each module: for example, two modules rotate clockwise and the other two rotate counter-clockwise in a coordinated manner.

 

 

Grasping Objects

Collaborative manipulation in the air is an alternative that reduces the complexity of adding manipulator arms to flying vehicles. In the following video you can see the Flying Gripper changing its shape in midair to accomplish the complex task of grasping a desired coffee cup. Would you like some coffee delivered?

 

 

 

This work was developed by:

Bruno Gabrich, David Saldaña, Vijay Kumar and Mark Yim

Additional resources at:

http://kumarrobotics.org/

http://www.modlabupenn.org/

Maker Faire Shenzhen summary


So we are now back in cold and dark Sweden after about a week's visit to warm and nice Shenzhen, China. Every time we go there, something major has happened. When we visited last time, about a year ago, cash was king. Now apparently payments are done with QR codes, even in small lunch restaurants. And I was kind of proud of the BankID and Swish payments we have here in Sweden, until now… Another observation we made was that there are now a lot of colorful rental bikes which can be found just about everywhere and rented for around 1 RMB/hour. A great way of sharing resources and pushing eco-friendly transportation. It has its downsides though, as piles of discarded bikes are commonly found, something e.g. The Guardian has written about.

Aside from the above observations, the Maker Faire Shenzhen was one of the reasons we came to visit. As Shenzhen is called "the Silicon Valley of hardware" we had pretty high expectations for the Maker Faire. Even though it was a great faire it did not quite reach those high expectations, but it is growing fast and I'm pretty sure that in a couple of years it will be the Maker Faire to be at. A quick summary: robotics was one of the top categories of products at the faire. 3D printers, which are popular at European and US faires, were not that common, which surprised us. Now let the pictures do the talking:

 

We exhibited at the faire, sharing a booth with Seeedstudio, where we showed an autonomous sequence on top of a table using the Flow deck. By pressing a button, the Crazyflie 2.0 would take off, fly in a circle, come back and land roughly in the same spot. It was a very engaging demo that caught many people's attention, especially the kids'. The kids constantly wanted to press the button and interact with the Crazyflie.


All the interaction made us very happy and next time we will try to add the obstacle avoidance deck to make it even more engaging.

 

Unfortunately the Crazyradio PA is out of stock in our store and is estimated to be back around December 1. Until then, please check our distributors for availability.

 

ModQuad – Self-Assemble Flying Structures


Modular robots can adapt and offer solutions in emergency scenarios, but self-assembling on the ground is a slow process. What about self-assembling in midair?

In one of our recent works at the GRASP Laboratory at the University of Pennsylvania, we introduce ModQuad, a novel flying modular robotic structure that is able to self-assemble in midair and cooperatively fly. This work is directed by Professor Vijay Kumar and Professor Mark Yim. We are focused on developing bio-inspired techniques for flying modular robots. Our main interest is to develop algorithms and controllers for self-assembling modular robots that can dock in midair.

In biological systems such as ant or bee colonies, collective effort can solve problems that individuals cannot solve efficiently, such as exploring, transporting food and building massive structures. Some ant species are able to build living bridges by clinging to one another, spanning gaps in the foraging trail. This capability allows them to rapidly connect disjoint areas in order to transport food and resources to their colonies. Recent works in robotics have focused on using swarm behaviors to solve collective tasks such as construction and transportation. Docking modules in midair offers an advantage in terms of speed during the assembly process. For example, in a building on fire, the individual modules can rapidly navigate from a base station to the target through cluttered environments. Then, they can assemble bridges or platforms near windows in the building to offer alternative exits to save lives.

ModQuad Design

ModQuad is propelled by a quadrotor platform; we use the Crazyflie 2.0. The vehicle was chosen because of its agility and scalability. Its low cost and payload give us an acceptable scenario for a large number of modules.

Very lightweight carbon fiber rods connected by eight 3D-printed ABS connectors form the frame. The frame weight is important due to the tight payload constraints of the quadrotor. Our current frame design weighs 7 g, about half the payload capability. The module geometry has a cuboid shape, as seen in the figure below. To enable rigid attachments between modules, we include a docking mechanism in the modular frame. In our case, we used Neodymium Iron Boron (NdFeB) magnets as passive actuators.

Self-Assembling and Cooperative Flying

ModQuad is the first modular system that is able to self-assemble in midair. We developed a docking method that accurately aligns and attaches pairs of flying structures in midair. We also designed a stable decentralized modular attitude controller to allow a set of attached modules to cooperatively fly. Our controller maximizes the use of the rotor forces by generating the maximum possible moment.

In order to allow the flying structure to navigate in a three dimensional environment, we control thrust and attitude to generate vertical and horizontal translations, and rotation in the yaw angle. In our approach, we control the attitude of the structure in a decentralized manner. A modular attitude controller allows multiple connected robots to stably and cooperatively fly. The gain constants in our controller do not need to be re-tuned as the configurations change.

In order to dock pairs of flying structures in midair, we propose to have two flying structures: the first one hovers and waits while the second one performs a docking action. Both the hovering and the docking actions are based on a velocity controller. Using a velocity controller, we are able to dock multiple robots in midair. We highlight that docking robots in midair is one of the fastest ways to assemble structures, because the building units can rapidly move and dock in three-dimensional environments. The docking system and control have been validated through multiple experiments.

Our system takes advantage of robot swarms because a large number of robots can construct massive structures.

 

 

This work was developed by:

David Saldaña, Bruno Gabrich, Guanrui Li, Mark Yim, and Vijay Kumar.

 

Additional resources at:

http://kumarrobotics.org/

http://www.modlabupenn.org/

http://davidsaldana.co/

 

Anchor position estimation in the client


As I wrote about in a previous blog post, I have been working on an anchor position estimation algorithm in the Crazyflie Client. The algorithm uses ranging data from the Loco Positioning system to estimate where the anchors are located, and thus removes the need to measure their positions in the room. I have finally reached a point where I think it is good enough to let out of the lab, and it has been pushed to the client repository.

A button has been added to the Loco Positioning tab that opens a wizard. In the wizard the user is asked to place the Crazyflie in certain positions to record ranges and define the coordinate system. If all goes well, the estimated anchor positions are transferred to the anchor position fields in the Loco Positioning tab. If the user is happy with the result, the next step is to write the positions to the anchors and start flying!

Now to the disclaimer: the results may not always be perfect – surprise! We have not tested the algorithm a lot, but it seems to give decent results; at the very least it can be useful as a base for manual corrections and sanity checks. Some of the estimated positions are pretty good, while others might be a meter or so off. The conclusion is that you should not trust it blindly: check that the estimated positions seem reasonable before flying.

Currently the system only supports Two Way Ranging, but extending it to TDoA should not be too complicated. There are probably many possible improvements to be made, and we hope that everyone who finds this interesting and has ideas for how to do it will give it a go. After all, it is open source, and we would love to see contributions refining the functionality now that there is a base to build from.

Any feedback is welcome, so let us know whether or not it works in your setup!

New TDoA2 algorithm merged into Master


The default working mode of the Loco Positioning System (LPS) is currently Two Way Ranging (TWR), a positioning mode that has the advantage of being pretty easy to implement and that gives good positioning performance for most use cases and anchor setups. This was a very good reason for us to start with it. However, TWR only supports positioning and flying one or maybe a couple of Crazyflies; it is not a solution for flying a swarm.

One solution for flying a swarm is an algorithm called Time Difference of Arrival (TDoA). We have had a prototype implementation for a while, but we experienced problems with outliers, most of them due to the fact that we were losing a lot of packets and thus using bad data.

To solve these issues, TDoA2 makes two changes:

  • Each packet has a sequence number, and each timestamp is associated with the sequence number of the packet it was created from
  • The distances between anchors are calculated and transmitted by the anchors

A slightly simplified explanation follows to outline why this helps (a more detailed explanation of how TDoA works is available in the wiki).

We start by assuming that all timestamps are available to the tag, this is done by transmitting them in the packets from the anchors to the copter.

The end goal is to calculate the difference in time of arrival between two packets from two different anchors. Assuming we have the transmission times of the packets in the same clock, all we need to do is subtract the time between the two transmissions from the time between the two receptions:

0 – anchor 0, 1 – anchor 1, T – Tag (that is the LPS deck on the Crazyflie)

To do so we need the time it takes for a packet to travel between the two anchors, which lets us calculate the transmit time of P2 in anchor 1's clock. This can be done by calculating the TWR time of flight between the two anchors, which would require the tag to receive 3 packets in sequence:

So now for the part where TDoA2 helps: previously we had to receive all 3 packets in sequence in order to calculate a TDoA; if any one of them was missing, the measurement would fail or, worse, give the wrong result. Since we did not have sequence numbers, it was hard to detect packet loss. Now that we have sequence numbers, we can tell when a packet is missing and discard the faulty data. We also no longer have to calculate the distance between anchors in the tag; it is calculated by the anchors themselves. This means that we can calculate a TDoA with only two consecutive packets, which increases the probability of a successful calculation substantially.
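
For illustration, here is a toy Python sketch of the simplified math above. It is not the LPS firmware and the packet format is made up, but it shows how the sequence number check and the subtraction fit together:

```python
# Toy illustration of the simplified explanation above, not the LPS
# firmware. All times are assumed to be in one common time base (seconds);
# the real implementation also has to handle clock drift and wrap-around.
SPEED_OF_LIGHT = 299792458.0

def tdoa_between_anchors(prev, curr):
    """Return the TDoA (s) between two consecutively received packets,
    or None if a packet was lost. prev/curr are dicts:
    {'anchor': id, 'seq': n, 'tx': transmit time, 'rx': reception time}."""
    # TDoA2 addition: a gap in the sequence numbers means a packet was
    # lost, so the data would be inconsistent and is discarded.
    if curr['seq'] != prev['seq'] + 1:
        return None
    # Reception interval minus transmission interval.
    return (curr['rx'] - prev['rx']) - (curr['tx'] - prev['tx'])

# Example: anchor 1's packet arrives 10 ns later than the transmission
# spacing alone would explain, i.e. the path difference is roughly 3 m.
p0 = {'anchor': 0, 'seq': 41, 'tx': 0.000, 'rx': 0.001000000}
p1 = {'anchor': 1, 'seq': 42, 'tx': 0.002, 'rx': 0.003000010}
d = tdoa_between_anchors(p0, p1)
print(d, d * SPEED_OF_LIGHT)  # ~1e-08 s, ~3 m
```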

To reduce packet loss even more, we have also added functionality to automatically reduce the transmission power of the nRF radio (the one talking to the Crazyradio dongle) when the LPS deck is detected. It has turned out that the nRF radio transmissions interfere with UWB radio reception, and since most indoor use cases do not require full output power, we figured this was a good trade-off.

The results we have seen with the new protocol are quite impressive. TDoA is usually very sensitive to the tag being inside the convex hull, so much so that with the first TDoA protocol we had to start the Crazyflie from about 30 cm up to be well within the convex hull. This is no longer required, and the position is still good enough to fly even a bit outside of the convex hull. The outliers are also greatly reduced, which makes this new TDoA mode behave very close to the current TWR mode, but with the capability to locate as many Crazyflies as you want:

On top of that, we have also implemented anchor position handling in the TDoA2 protocol, which means that it is now as easy to set up a system with TDoA2 as with TWR:

We are now working on finishing the last pieces of functionality, like switching between algorithms (TWR and TDoA), and on writing a "getting started" guide. When that is done, TDoA will become an official mode for the LPS.

In the meantime, if you are adventurous, you can try it yourself. It has been pushed to the master branch of the Crazyflie firmware and the LPS node firmware. You should re-flash the Crazyflie firmware, both STM32 and nRF51, from master, and the anchors from master too.

The STEM drone bundle


We've been seeing an increase in the demand for a "programmable drone", where users can easily give simple commands through scripting and have the Crazyflie 2.0 follow them. In order for this to work well you need closed-loop control, i.e. a reference system that tells you how you're moving. Previously this was only possible using external camera systems or bulky on-board cameras, but a while ago we released the Flow deck, which solves this problem. Thanks to the mouse-like sensor the deck contains, the Crazyflie 2.0 can see how it's moving along the floor. Suddenly it's possible to give commands like "move 1 m forward" or "fly in a clockwise circle with a radius of 1 m".
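
As an illustration of what such scripting can look like, here is a minimal sketch using the MotionCommander from the Crazyflie Python library; the radio URI and the distances are example assumptions, and the getting started guide mentioned below has the authoritative version:

```python
# Minimal scripting sketch with the Crazyflie Python library (cflib).
# The radio URI and the distances are example values, not requirements.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.motion_commander import MotionCommander

URI = 'radio://0/80/2M'  # assumption: adjust to your own Crazyflie

cflib.crtp.init_drivers()

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    # MotionCommander takes off on enter and lands on exit; the Flow deck
    # provides the feedback that makes these relative motions possible.
    with MotionCommander(scf, default_height=0.5) as mc:
        mc.forward(1.0)                      # "move 1 m forward"
        mc.circle_right(1.0, velocity=0.5)   # clockwise circle, radius 1 m
        time.sleep(1)
```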

To make it easier for users to pick out the parts needed, we've put together a discounted STEM drone bundle. It contains all the parts needed for scripting the flight. If you have a gamepad or a Bluetooth LE enabled phone you can of course fly it manually as well 🙂

To quickly get up and running, we have written a getting started guide. There is also a great hackster project, Beginner’s Guide to Autonomous Quadcopters by community member Chathuranga Liyanage, containing more details.

Happy holidays from Bitcraze!


This year we decided to celebrate the holiday by painting a Christmas tree rather than dressing one like last year. What better way to do this than with the Flow deck, an LED-ring and a long exposure photo. To check out all the yummy details and how to DIY, have a look at this hackster project we made. As a Christmas extra we also made this light painting video with the LED-ring mounted on top of the Crazyflie 2.0 and a bit of video editing. To be able to mount the LED-ring on top we hacked together an inverting deck. Not a bad idea, and something we aim to release in the future!

 

Getting started

For those of you who were lucky and got a Crazyflie 2.0 under the Christmas tree, here is a short intro to get you started.

You can find all our getting started guides in the “Tutorials” menu on www.bitcraze.io. Take a look at “Getting started with the Crazyflie 2.0” to see how to assemble the kit and take off for your first flight. If you have an expansion deck you will also find a guide for how to install it.

Development

When you are comfortable flying the Crazyflie you might feel that it is time for the next step, to make use of the flexibility of the platform. After all it is designed to be modified!

Check out the “Getting started with development” tutorial to set up your development environment, build your first custom firmware and download it to the copter.

Maybe you want to add a sensor or some other hardware? Heat up your soldering iron and dive into it! Find more information about the expansion bus on the wiki. The wiki is the place to look for all product and project documentation.

All source code is hosted on github.com/bitcraze and this is also where you will find documentation related to each repository. 

Projects

Looking for inspiration for a project? Take a look at hackster.io or read our blog posts. The video gallery contains some really cool stuff, as does our YouTube channel.

Contribute!

Open source is about sharing, creating something awesome together and contributing to the greater good! Whenever you do something that you think someone else could benefit from, please contribute it! If you were curious or confused about something, someone else probably will be too. Help them by sharing your thoughts, insights and discoveries.


Need help?

Cannot find the solution to a problem? Don't understand how or what to do? Have you read all the documentation and are still confused? Don't worry, head over to the forum and check if someone else has had the same problem. If not, ask a new question on the forum and get help from the community.

Happy holidays from the Bitcraze team!


Recap of 2017


It is now the first day of 2018 and a good day to look back at 2017. It's been a busy year as always and we have had a lot of fun along the way. One of the first things that pops up is that things take so much longer than we think. Luckily we are working with open source, and progress does not depend only on us, as we have awesome help from the community. We are already really excited about what's coming in 2018 and look forward to working together with so many great people!

Community

The Crazyflie 2.0 is still gaining attention and is becoming more and more popular among universities around the world. We see interest from researchers working with autonomous systems, control theory, multi-agent systems, swarm flight, robotics and all kinds of other research fields, which is really great. This means that a lot of exciting work has been contributed by the community, so here is a small summary of what has happened during the year.

In the beginning of the year the Multi-Agent Autonomous Systems Lab at Intel Labs shared how the Crazyflie 2.0 is used in their research on trajectory planning in cluttered environments. We wrote a blog post about this if you want to learn more about their work. The Crazyflie also showed up on the catwalk of Berlin Fashion Week as part of fashion designer Maartje Dijkstra's futuristic creation "TranSwarm Entities", a dress made out of 3D prints accompanied by autonomously flying Crazyflies.

For the third year Bitcraze visited FOSDEM. We had a good time and got to hang out with community members like Fred, who gave a great presentation about what's new in the Crazyflie galaxy. During the conference we took the opportunity to present the Loco Positioning system and demo autonomous flight with the Crazyflie controlled by it. In the demo we flew with the non-linear controller from Mike Hammer using trajectory generation from Marcus Greiff.

We have had a few interesting blog post contributions from major universities during the year, including a guest post written by researchers at Carnegie Mellon University, who are using the Crazyflie 2.0 drone to create an adaptive multi-robot system. Similar work has been done by researchers at the Computer Science and Artificial Intelligence Lab at MIT, where they have been studying coordination of multiple robots, developing multi-robot path planning for a swarm of robots that can both fly and drive.

We have also had two interesting guest blog posts from the GRASP Laboratory at the University of Pennsylvania: "A Flying Gripper based on Modular Robots" and "ModQuad – Self-Assemble Flying Structures". Inspired by swarm behavior in nature, for instance how ants solve collective tasks, both projects explore how multiple Crazyflies can work together to perform different missions.

During the fall Fred took the time to pay us a visit at the office in Sweden and work together with us. He is making great progress on the Java Crazyflie lib that is going to be used in the Android client as well as in PC clients. It allows connecting to and using a Crazyflie from any Java program, and there has already been some successful experimentation using it from Processing.

Some other great news is that thanks to Sean Kelly the Crazyflie 2.0 is now officially supported by the Betaflight flight controller firmware. Betaflight is a flight controller firmware used a lot in the FPV and drone racing community.

Thanks to denis on the forum, there is now support for Crazyflie 2.0 in the PX4 flight controller firmware. PX4 is a comprehensive flight controller firmware used in research and by the industry.

Finally, the Crazyswarm project by Wolfgang Hoenig and James A. Preiss from USC ACTlab was presented at ICRA 2017. It is a framework that makes it possible to fly swarms of Crazyflie 2.0s using a motion capture system. There is currently some work being done on merging the Crazyswarm project into the Crazyflie master branch, which will make it even easier to fly a swarm of Crazyflies. In the meantime the project is well documented and can be used by anyone who has a couple of Crazyflies and a motion capture system.

Hardware

During 2017 we released four new products, beginning with the Micro SD-card deck, which for example makes high speed logging possible. Then came the Z-ranger, which enables a height hold flight mode up to 1 m above ground. We like to call it drone surfing, as that is very much what it feels like when flying. We ended the year by releasing two boards, the Flow deck and the Flow breakout, in collaboration with PixArt, containing their new PMW3901 optical flow sensor. The Flow deck enables scriptable flight, which is very exciting. That led us to release the STEM drone bundle, which we hope will inspire people to learn more about flying robotics.

Hardware prototypes, our favorite sub-category, are something we have plenty of lying around here at the office. To name a few: a possible Crazyradio 2, the Loco positioning tag, the Crazyflie RZR, the Glow deck and the Obstacle avoidance/SLAM deck. It takes a long time to make a finished product… Hopefully we will see more of these during 2018!

Software

At the same time as we released the Flow deck, we also released the latest official Crazyflie 2.0 firmware and client (2017.06). This enables autonomous capabilities as soon as the Flow deck is inserted, by automatically turning on the corresponding functionality. Just before that, the Loco Positioning system was brought out of early access with improved documentation and simplified setup. Since then a lot of work has been put into making a release of TDoA and improving overall ease of use. With TDoA2 and automatic anchor position estimation starting to work pretty well, we should not be far from a new official release!

We would like to end 2017 with a big thank you to our users and community with this compilation video. Make sure to pump up the volume!

video link

Visiting 34C3 and other conferences


A couple of weeks ago I visited 34C3, the Chaos Communication Congress, with Fred. The trip was not 'official' Bitcraze business but more of a personal interest; the Congress is a great place to be and I hope to be able to go next year. While there, we found out that Foosel was there too. She is the developer and maintainer of the Octoprint project (our 3D printer would be much less useful without Octoprint…). It was awesome to finally meet her in real life; she has been in the Crazyflie community since the beginning and we had never managed to meet, even though we have done a couple of maker faires in Germany.

Meeting the community in person is always awesome; it is one of the best parts of going to conferences.

At the end of the month we will be at FOSDEM in Belgium. Fred will be there too, and he is planning to demo the Crazyflie at the Eclipse booth. If anyone else is coming please let us know, and we can improvise a Crazyflie meetup there.

Later in the year, in good Bitcraze style, we have not planned anything yet. Last year we went to ICRA, which was a very good experience, and we might be leaning towards IROS this year. Let us know if there is any conference at which you would like to meet us and we will consider going.

Crazyflie in artistic projects: making of La Mouche Folle


This week we have a guest blog post by Ben, enjoy!

I'm Ben Kuperberg and I'm a digital artist, artist-friendly software developer and orchestra conductor. Being a juggler, I've decided to focus some of my work on the intersection between juggling and technology, and I've since been working more and more with jugglers, my latest project being "Sphères Curieuses" from Le Cirque Inachevé, created by Antoine Clée. While the whole project is not focused on drones, a part of it involves synchronized flight of multiple drones and precise human interaction with those drones. Swarm flight is already out there and some solutions exist, but the context of this project added some challenges to it.

Most work on drone swarms has been done by research groups or schools. They use high-grade, expensive motion-capture systems able to track the drones precisely and assign their absolute positions. While the quality of the result is undeniable, it is not fit for stage shows: the setup takes a lot of time, which we can't always afford when the show is on the road. Moreover, the mocap system is too invasive for the stage if you want to "hide" the technology a bit and let the spectators focus on what the artist wants them to see. Not to mention it costs an arm and a leg, and Antoine needs both to juggle.

So we had to find other ways to track multiple drones. That's when we found out that the [amazing] team at Bitcraze was working on the TDoA technology, which allows precise-enough tracking of a virtually unlimited number of drones, at reduced cost and with a fast and clean setup.

After some work we had a first rough version of our swarm server, made by Maxime Agor, that allowed connecting to and moving multiple drones using the TDoA system, controlled from a Unity application.

While we were able to present a decent demo with this system, we faced a major problem of reactivity. When working with artists and technology, reactivity is a key component of creativity, because it can be frustrating and tense to stop every 2 minutes to make changes or fix problems. My first priority was therefore to prepare and design software that would allow me to spend most of the "creation time" on the actual creative work and not on technical parts. It is also essential that the artist performing in front of the audience can focus entirely on the performance and be fully confident in this technology. The last challenge is that, as I focus my work on the creation and not touring, all my work needs to be easily understood and modified by both the artists and the technicians who will take over my work for the tour.

With all of that in mind, I decided to create a software with a high-end user interface called "La Mouche Folle" ("The Crazy Flie" in French) that allows controlling multiple drones and gives an overview of all the drones, their battery/charging/alert states, auto-connect / auto-reboot features, external control via OSC, and a Unity client to view and actually decide how to move the drones. All my work is open source, so you can find the software on GitHub.

There is only a Windows release for now, but it should compile just fine on OSX and Linux. The software is made with JUCE and depends on OrganicUI and libusb. Feel free to contact me if you want more information on the software. Many thanks to Wolfgang Hoenig for the support and the great work on the crazyflie cpp library I'm relying on.

So this is the basic setup of our project, but we needed more than that to control the drones. We wanted to be able to control them in the most natural way possible. We quickly decided to go with glove-based solutions and have been working with Specktr to get our hands – pun intended – on developer versions of the glove. The glove is good but can't give us the absolute position of the hand, so we added HTC Vive trackers with the Lighthouse technology and were then able to get both natural hand control and sub-millimeter precision of the tracked hand.

Then it was a matter of connecting everything together: for other projects at Theoriz Studio, I had already developed MrTracker (used in the MixedReality project), which acts as a middleware between the Vive trackers and Unity.

I used Chataigne to easily connect and route the Specktr Glove data to Unity as well, so we would have maximum flexibility to switch hardware or technology without breaking the whole setup if we needed to.

 
A video of the final result
 

 

In the past years I've come to work on a lot of different projects with different teams, which I like very much, because each project leads to discovering new people, new ways of working and new challenges to overcome. I'm having a great time working on this project and especially sharing everything with the guys at Bitcraze and the community; everyone has been so cool and nice. I've planned to go to the Bitcraze studio to work with them for a few weeks and I'm sure it'll be a great experience!

External positioning system support


We have been lucky to get the opportunity to use a motion capture system from Qualisys in our flight lab. The Qualisys system is a camera-based system that uses IR cameras to track objects with sub-millimeter precision! The cameras are designed to measure the position of, and track, small reflective marker balls that are fixed to the object to be tracked with high accuracy. By using multiple cameras shooting from different angles, the system can calculate the 3D position of a marker in space. By mounting multiple markers on an object, the system can also identify the object as well as its orientation in space. Very cool!

We have started to look at how to add support in our ecosystem for the Qualisys system as well as other "external" positioning systems; external in this context means systems that calculate the position outside the Crazyflie. There is already great support for external positioning in the Crazyswarm project by the USC-ACTLab, but we are now looking at lightweight support in the Python client. We are not sure yet what we will add, but the ideas are along the lines of viewing an external position in the client, feeding an external position into the Crazyflie for autonomous flight (see the sketch below) and maybe a simple trajectory sequencer.
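
To give an idea of what "feeding an external position into the Crazyflie" could look like from Python, here is a rough sketch; the mocap query is a placeholder and the URI is an assumption:

```python
# Rough sketch of forwarding an externally measured position to the
# Crazyflie's state estimator with cflib. get_mocap_position() is a
# placeholder for whatever the mocap SDK provides; the URI is an assumption.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'


def get_mocap_position():
    # Placeholder: replace with a real query to the motion capture system.
    return 0.0, 0.0, 1.0


cflib.crtp.init_drivers()

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    while True:
        x, y, z = get_mocap_position()
        # Feed the measurement into the on-board estimator.
        scf.cf.extpos.send_extpos(x, y, z)
        time.sleep(0.01)  # roughly 100 Hz
```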

MoCap Deck

We have also started to design a MoCap Deck to make it easy to mount reflective markers on the Crazyflie. Our design goals include:
* light weight
* easy to use
* support for multiple configurations to enable identification of individuals
* the possibility to add a button for human interaction

The suggested design of the MoCap Deck

Any feedback on the MoCap Deck and ideas for functionality to add to the client are welcome! Please add a comment to this blog post or send us an email.

We will write more about the Qualisys system later on, stay tuned!

Release of the Loco Positioning System TDoA mode, a.k.a. Swarm mode


 

We have written a couple of times already about the new TDoA2 algorithm for the Loco Positioning System. A TDoA mode has been experimental since the day we released the LPS, but we are now proud to announce that TDoA is an official positioning mode for the Loco Positioning System and the Crazyflie.

Practically, it means that the Loco Positioning System now has an officially supported mode to locate and fly a swarm of Crazyflie 2.0s.

We have spent the last weeks updating the documentation and the "Getting started" tutorial and releasing all the affected firmware and software. One of our goals was to make the new TDoA mode as seamless and easy to work with as possible, which meant having everything work without recompiling the Crazyflie or any other part of the system. The Crazyflie now detects the LPS mode automatically, and it is possible to configure the anchor positions and ranging mode remotely from within the Crazyflie client's LPS tab.

What we have just released is:

If you have 8 anchors and want to convert your local positioning system to TDoA, this can be done very easily by following the new version of the getting started with loco positioning system guide.

If you want more information about the different positioning modes, we have also updated the system description.

 

Bitcraze VM and Crazyflie development environment


We just released a new version of the Bitcraze VM, version 2018.01. There is nothing very new in this version; the VM has been rebuilt so that all the projects included in it are up to date. This solves an issue where the Crazyflie client was blocked in the previous revision.

The current VM is running a quite old version of Ubuntu, 14.04 LTS. We are planning to refresh the VM by making a new one when Ubuntu 18.04 LTS is released.

Since the Crazyflie 1 days we have been documenting the VM as the standard development environment. This has a couple of advantages:

  • We can distribute a fully set-up development environment that has minimal dependencies on the host system
  • If someone has a problem with the VM, there is a big chance we can reproduce and fix it, since everyone is running the same system
  • Everything is pre-setup so it should be fairly quick to get started with the actual firmware or software development

However the VM solution also has drawbacks:

  • It requires installing and somewhat configuring VirtualBox or other virtual machine software
  • It has some cost in performance, mostly for USB as it slows down the communication with the Crazyflie
  • The USB implementation seems to have bugs on Windows, which makes the communication with the Crazyflie buggy. This is currently the biggest problem!

So, the situation is not ideal, and we would love to get some feedback from the community.

There are two very different parts in the system: the lib and client in Python, and the firmwares in C.

  • Starting development on the Python parts, on Windows/Mac/Linux, is fairly straightforward. Basically one has to install Python and git, clone the projects and install dependencies, and it runs. Different Python IDEs can be used and work pretty much out of the box.
  • Starting development on the embedded C parts can be a bit more challenging. On Linux and Mac it is pretty easy, since it only requires downloading the arm-embedded-gcc compiler and adding it to the path. On Windows things are a bit more complex, because you also need Make and I haven't yet figured out the best way to install that. Having an IDE requires configuring Eclipse CDT.

What do you think about the VM as a development environment and would you prefer other solutions like documentation for each operating system on how to install a development environment?

Multi-ranger deck update


During the fall we wrote two blog posts (1, 2) about a new prototype named the Obstacle Avoidance/SLAM deck, but since then it has been a bit quiet about it. So we thought it was due for an update! First of all, after a lot of discussions, we decided to rename the deck to Multi-ranger. It better describes what the board does and matches the naming of the Z-ranger. We've sent out some samples to customers and so far the response has been great, so we're pushing forward and preparing for production, which is estimated to begin in March. Below is a picture of the latest prototype.

The biggest change for the final prototype is adding an LDO regulator to power the sensors. We've seen that, depending on the settings, the sensors might consume a lot more than when we initially tested. Using the same settings as for the Z-ranger brings the consumption to 90 mA, which together with the Crazyflie 2.0 electronics comes close to filling the power budget of the Crazyflie 2.0 VCC LDO regulator. Aside from that, we're making some minor changes to simplify production and testing.

We’ll keep you updated on the progress!


Merging Crazyswarm functionality into the official Crazyflie firmware


We have seen a big interest in flying swarms of Crazyflies, and there are many challenges in doing so. The USC ACT Lab has developed Crazyswarm, a collection of software and firmware that makes it possible to fly big swarms of Crazyflies using a motion capture system. This project has been used by USC and other universities to fly the most impressive swarms of Crazyflie 2.0s to date.

Picture from “Downwash-Aware Trajectory Planning for Large Quadrotor Teams” publication using Crazyswarm

We are very happy that we, together with Wolfgang and James, the main developers of Crazyswarm, have started to merge the firmware part into the official Crazyflie firmware. Merging the code will have two great consequences: people will be able to use Crazyswarm with a Crazyflie 2.0 running the stock firmware, and everybody else will be able to use functionality that has been developed for Crazyswarm.

There are currently a couple of parts in the works. The state controller has already been merged. There is some discussion on GitHub on how to merge the high-level commander, a commander that would allow the Crazyflie to autonomously follow trajectories as well as execute other high-level commands. Finally, some work will be required to adapt the Kalman filter to make it better suited to accepting measurements from a motion capture system. The Crazyflie was not developed as an autonomous platform from the beginning, but it is becoming one, in big part thanks to the great contributions from the community.
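
As a rough idea of what such high-level commands could look like from the Python lib, here is a hedged sketch; since the API is still being discussed on GitHub, the method names, signatures and URI below are assumptions rather than the final interface:

```python
# Hedged sketch of a high-level command sequence from Python. The method
# names follow the Crazyswarm-style commands (take off, go to, land), but
# since the high-level commander API is still under discussion, treat the
# names, signatures and URI below as assumptions.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M'

cflib.crtp.init_drivers()

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    hlc = scf.cf.high_level_commander
    hlc.takeoff(0.5, 2.0)               # target height (m), duration (s)
    time.sleep(2.0)
    hlc.go_to(1.0, 0.0, 0.5, 0.0, 3.0)  # x, y, z, yaw, duration (s)
    time.sleep(3.0)
    hlc.land(0.0, 2.0)
    time.sleep(2.0)
    hlc.stop()
```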

A great thanks to James and Wolfgang for their effort in merging CrazySwarm in the Crazyflie code-base!

Out of stock
Unfortunately we miscalculated how much China slows down during Chinese New Year, which has caused some products to go out of stock. One of them is the Crazyradio PA, which in turn makes some bundles out of stock as well. The good news is that the products are in transit to the warehouse and will hopefully be back in stock any day now. Until then you can use the "Item out of stock – notify me!" functionality to get notified as soon as a product is back in stock.

 

Collaboration with Qualisys


Qualisys is a motion capture (mocap) system manufacturer based in Gothenburg, Sweden. Since we are also based in Sweden, Qualisys has been able to visit us a couple of times, and we now have one of their motion capture systems installed at the office. This collaboration gives us access to a mocap system, something we did not have previously. It means that we can better support people using motion capture systems with the Crazyflie.

We are currently implementing support for Qualisys in the Crazyswarm project. Crazyswarm currently supports a couple of motion capture systems, including Vicon and OptiTrack; with the addition of Qualisys, we and everyone else with a Qualisys system will be able to fly swarms of Crazyflies in their mocap setup.

We are also planning on having a combined Bitcraze and Qualisys booth at IROS 2018, where we plan to demo flying both with the mocap system and with the Loco Positioning System. We will keep you updated when we have more details.

We look forward to this collaboration since it will allow us to use and better support motion capture positioning for the Crazyflie.

Collaboration with Udacity


We are excited to announce that the Crazyflie 2.0 and the STEM bundle have been chosen by Udacity for their Flying Car Nanodegree Program. For students who want to try out their skills on a real-world flying drone, the core curriculum has been augmented with supplemental lessons, and Udacity announces that they will provide thorough instructions for the Crazyflie.

 

Udacity is providing on-line learning and their mission is 

“to democratize education through the offering of world-class higher education opportunities that are accessible, flexible, and economical”

We are super happy that Udacity likes the Crazyflie and that more people will have the opportunity to explore the world of robotics!

Crazyflie clients


We thought we could use this Monday blog post to give a small state of the Crazyflie clients. What we call a Crazyflie client is a piece of software that connects to a Crazyflie and allows controlling it and getting telemetry back from it. In this post we will concentrate on the single-Crazyflie clients we have on our GitHub page; there exist a lot of libraries and programs to control one or many Crazyflies, and we will write another blog post about them.

Crazyflie PC client

The Crazyflie PC client is what we consider the reference client. It supports connecting to one Crazyflie using the Crazyradio (PA) dongle or a direct USB connection to the Crazyflie 2.0. It supports the full Crazyflie telemetry (i.e. log), parameters (i.e. params) and firmware update, and it has support for all the Crazyflie 2.0 decks that can make use of client support. It is updated whenever new functionality is added to the Crazyflie, which makes it actively developed and maintained by the community and Bitcraze. A Bluetooth link has not been prioritized so far, since a multi-platform implementation is non-obvious and Bluetooth would introduce some latency and lower the radio bandwidth compared to the Crazyradio. However, if anyone would want Bluetooth support for the Crazyflie PC client, we welcome contributions :-). The Crazyflie PC client uses the crazyflie-lib-python to communicate with the Crazyflie.
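
As a small illustration of what the telemetry (log) subsystem looks like from crazyflie-lib-python, here is a minimal sketch that reads the battery voltage; the URI is an assumption:

```python
# Minimal telemetry (log) sketch with crazyflie-lib-python, the same
# library the PC client is built on. The URI is an assumption; pm.vbat
# (battery voltage) is a standard log variable in the stock firmware.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M'

cflib.crtp.init_drivers()

log_conf = LogConfig(name='Battery', period_in_ms=100)
log_conf.add_variable('pm.vbat', 'float')

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, log_conf) as logger:
        for _timestamp, data, _logconf in logger:
            print('Battery: %.2f V' % data['pm.vbat'])
            break
```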

We have three mobile clients on our GitHub. They have various levels of functionality depending on community involvement. Our philosophy is to have the mobile clients at least able to control a Crazyflie; this makes it possible to test Crazyflies without setting up a computer. We will help and support anyone who is interested in adding functionality to the mobile clients, but we generally do not have time to add much functionality ourselves.

The Android Crazyflie client is currently maintained by Fred from the community. It is the mobile Crazyflie client with the most features. It supports both the Crazyradio and a Bluetooth link. Using the Crazyradio, it currently supports the parts of telemetry and parameters required to handle a couple of decks, like the LED-ring and buzzer decks, and it supports updating the firmware. Using Bluetooth there is currently no telemetry, parameter or firmware update functionality, so no deck support. Development is in progress to support more decks and to bring the Bluetooth link to the same level of functionality as the Crazyradio link. The Android client is written in Java, and Fred has developed a Crazyflie Java library that is used in the Android client but that can also be used in any other Java program.

 

Crazyflie Android client

The iOS Crazyflie client works on iPhone and iPad. It supports a Bluetooth link. It does not have any telemetry or parameter support, so no deck control support, but it does support firmware update over Bluetooth. It has mainly been developed by me, with great contributions from the community for, among other things, the port to Swift. The iOS client is written in Swift. The Crazyflie and Bluetooth parts of the code could be a good starting point if anyone wanted to make a native Mac Crazyflie client.

Crazyflie iOS client

Finally we have a prototype of a Windows UWP client developed by theseankelly. It supports Bluetooth Low Energy. It currently does not support any telemetry or parameters. It works both on Windows Phone and on Windows 10 computers, and it is currently the only way to connect to a Crazyflie using Bluetooth from a laptop. The Windows client supports manual control of the Crazyflie using a gamepad, or with gestures using HoloLens. This original set of functionality makes it both the most simple and the most advanced Crazyflie client :-).

If you are interested in developing for any of these clients, or in making your own, feel free to open a ticket on the relevant GitHub repo or start a thread on the forum. We might not have much time to develop the mobile clients ourselves, but we will always be glad to help and guide anyone who wants to implement software related to the Crazyflie. The Crazyflie clients (running on a computer or phone) and the Crazyflie firmwares (running in the Crazyflie itself) are open source and in active development, which means it is possible to modify both sides; this makes them a great target for experiments and for playing around with new ideas :-).

 

Android Crazyflie Client status and release


This is a guest blog post written by Fred, the maintainer of the Android Crazyflie client and Java Crazyflie lib.

As a follow-up to last week’s blog post about the different Crazyflie clients, I would like to describe the current status of the Android client in a bit more detail.

After more than a year, version 0.7.0 of the Android client was released last Friday (March 16). Most importantly, this release fixes a very annoying UI bug that appeared on newer Android versions, where the on-screen joystick did not show up when the app was started for the first time. It also adds support for height hold mode when using the Z-ranger or Flow deck, adds a console view (which can be enabled in Settings -> App settings -> Show console) and now also shows the link quality for BLE connections. You can read the full changelog on GitHub. You can find the release in the Google Play Store and as an APK on GitHub.

Connection quality and console messages now work on a BLE connection

Apart from the obvious/visible new features and bug fixes, quite a lot has happened “under the hood”. Some parts of the code were cleaned up, refactored, decoupled, etc. This is still a work in progress though.

There is still plenty of stuff to do for future releases, especially in the realm of Bluetooth support. On the short list are:

  • Param/Log support for BLE connections
  • bootloading over BLE
  • support for Flow deck sequences

Admittedly there was almost no documentation for the Android client and some features are hidden (too well). In an effort to change that, I’ve started to document some features on the project’s Github wiki.

If you find bugs in the app, want to request a feature or see errors in the documentation, please open a GitHub issue.

If you are interested in the development of the Crazyflie Android client and want to get involved, let me know. The fastest way to get new features added or bugs fixed is to contribute a pull request.

Last but not least, I'd like to thank the Bitcraze team for creating and developing the Crazyflie and amazing new decks. Maintaining the Crazyflie Android client is still one of my favorite pastimes. :)
