Computing Gaits Revisited

March 22, 2011

The latest (and final) solution we have implemented for gait generation is summarized in the following picture:

These graphs depict the duty cycles of various gaits. The black portions represent the percentage of the cycle the foot spends touching the ground, and the rest represents time in the air. The approach I chose was to visualize the foot tracing an arc. An arc in 3D space can be described by a height, width, and length, and these are the parameters my simulation takes in (as well as a speed). Previously I had created functions to express the motion of a gait in an animal, but found that approach troublesome for precise foot placement.

The following explains the gait algorithm implemented on our robot. We start by creating a gait loop that runs through a gait from 0% to (final cycle + duty cycle)%. The duty cycle is the maximum percentage of the cycle a foot spends off the ground, and the final cycle is the percentage at which the last foot in the series leaves the ground. As an example, in the bound gait the duty cycle is approximately 35% and the final cycle is approximately 70%, so the loop will run from 0% to 105%. Lastly, we need an offset for each leg: the point in the cycle where that foot first contacts the ground. Again looking at the bound gait, the LF and RF offsets will be 0% and the RH and LH offsets will be 38%.

As you iterate through the loop, there will be times when the leg is off the ground: below the offset and above offset + duty (0% to 38% and 70% to 105% in this example) the leg will be in the air, and from 38% to 70% it will be on the ground. To calculate the desired x, y, z of the foot as it goes through the gait, I do something similar to the following:

//Leg 1 Motion Planning
/*
i is the iterator, or the current percentage in the cycle
i0 is the "phaser", or the "angle" within the arc
Fl (forward locomotion) is a boolean that is 1 when the robot should be moving forward and 0 when it should be stopped. Vl (vertical) and Hl (height) work the same way.
Desiredx is initialized to where the foot starts in your system along the X axis, Desiredy along the Y axis, and Desiredz along the Z axis.
length, width, and height are how much range of motion you want.
maxi is offset + duty (the total cycle time)
*/
if ((i > offset) && (i <= (offset + duty))) {
    i0 = (i - offset) * (360 / duty);                       // Where you are in the phase as you iterate through the gait (0 -> 360 degrees)
    Desiredx = Desiredx + Fl * width * (1 / duty);          // Calculates the desired x
    Desiredy = Desiredy + Vl * length * (1 / duty);         // Calculates the desired y
    Desiredz = Desiredz - Hl * sin(i0 * PI / 180) * 2 / duty * height;  // Calculates the desired z (the arc height)
}
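To show how this fits into the overall loop, here is a minimal sketch in the spirit of our C++ program. The Leg structure, the 1% step size, and the function and parameter names are illustrative assumptions, not our exact code.

// Minimal sketch of the full gait loop (illustrative only, not our exact implementation).
#include <cmath>

const double PI = 3.14159265358979;

struct Leg {
    double offset;          // percent of the cycle at which this leg's arc begins
    double x, y, z;         // current desired foot position
};

void runGaitCycle(Leg legs[4], double duty, double finalCycle,
                  double length, double width, double height,
                  double Fl, double Vl, double Hl)
{
    double maxi = finalCycle + duty;                 // loop runs from 0% to (final cycle + duty)%
    for (double i = 0.0; i <= maxi; i += 1.0) {      // 1% steps, purely for illustration
        for (int n = 0; n < 4; ++n) {
            Leg &leg = legs[n];
            if ((i > leg.offset) && (i <= (leg.offset + duty))) {
                double i0 = (i - leg.offset) * (360.0 / duty);   // 0 -> 360 degrees through the arc
                leg.x += Fl * width  * (1.0 / duty);
                leg.y += Vl * length * (1.0 / duty);
                leg.z -= Hl * std::sin(i0 * PI / 180.0) * 2.0 / duty * height;
            }
        }
        // ...solve the inverse kinematics for each leg and command the motors here...
    }
}

Each leg keeps its own offset (0% for LF/RF and 38% for RH/LH in the bound example), while the duty, final cycle, and stride parameters are shared across all four legs.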

As I iterate through the gait, I then solve the inverse kinematics of the system, which gives me the angle each ‘link’ must reach to hit each desired x, y, and z. The hip is more complicated because we are using a linear actuator, so some math is involved to calculate what position the motor must reach to extend the actuator far enough to tilt the entire leg by the necessary amount (trig required!). You can see the results of this in the gallery below.
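To give a rough idea of the trig involved, the sketch below uses the law of cosines to convert a desired lateral hip tilt into an actuator length. The geometry constants (anchor distances, rest angle) are hypothetical placeholders, not Sabertooth's real dimensions.

// Rough sketch: actuator length needed for a desired hip tilt, via the law of cosines.
// All geometry constants below are assumed placeholder values.
#include <cmath>

double actuatorLengthForTilt(double tiltDeg)
{
    const double PI = 3.14159265358979;
    const double a = 0.10;             // [m] hip pivot to actuator anchor on the body (assumed)
    const double b = 0.06;             // [m] hip pivot to actuator rod end on the leg (assumed)
    const double restAngleDeg = 90.0;  // angle between a and b when the leg is vertical (assumed)

    double theta = (restAngleDeg + tiltDeg) * PI / 180.0;
    // Law of cosines: c^2 = a^2 + b^2 - 2*a*b*cos(theta)
    return std::sqrt(a * a + b * b - 2.0 * a * b * std::cos(theta));
}

The motor setpoint would then come from the change in this length divided by the actuator's travel per motor revolution (or per encoder count).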

After creating the MATLAB simulation I showed previously, which allowed me to calculate the positions, velocities, accelerations, and torques of Sabertooth, I noticed the processing was taking a heavy toll on the CPU. The graphical visualization of how the system moves was a phenomenal help for testing. It allowed us to change variables such as the stride length, width, and height (how much the end effector is allowed to move per gait cycle), and to actually walk the robot around while changing its speed and orientation.

However, the lag in this simulation was worse than anticipated, and unfortunately it meant the simulation window could not be used as the “controller” that would represent our robot’s orientation as it walked in 3D.

As a replacement we have created a C++ program which will work alongside the ROS package (used for path planning, the LIDAR, the IMU, and communication with the NI sbRIO) that we will be running on the ASUS eeePC. MATLAB will still be an important part of logging and understanding the enormous amount of data our robot will output, because it allows us to plot that data.

Currently the C++ program is used for gait generation and inverse kinematics, which enables us to (almost) instantaneously calculate gaits and solve for the position each motor must be at by the microsecond, allowing smooth paths for our legs. The C++ program also logs all the calculations, and a MATLAB script is used to interpret the thousands of numbers logged.

To understand the following data, note that Link 1 (attached to the body) moves the entire leg laterally, Link 2 moves the leg (thigh and ankle) tangentially to the body, and Link 3 moves the ankle tangentially to the body.

The following is data plotted from the C++ logging files for the different gaits our robot can (hopefully) test:


3D Mapping and Path Planning

March 21, 2011

The team has completed the necessary steps for the robot to successfully navigate through a semi-unknown environment. We made a few key assumptions before developing our path planning. First, we assume height clearance for the robot; this allows a simplification of the path planning, which will be explained later. Second, we assume a semi-static environment: to the best of our ability, we will limit the interference of outside influences on the environment. Third, we assume a semi-unknown environment: we will have a pre-constructed map containing data about walls and exits, but the robot itself will have to verify the map and detect any unknown obstacles.

The first step was, of course, to gather 3D data. This was accomplished by mounting a 2D LIDAR on a panning servo, giving us 3 axes of information. The LIDAR we are using is the Hokuyo UTM-30LX, with the following specs:

Specifications:

  • Voltage: 12.0 V ±10%
  • Current: 0.7 A (rush current 1.0 A)
  • Detection range: 0.1 m to approximately 60 m (<30 m guaranteed)
  • Laser wavelength: 870 nm, Class 1
  • Scan angle: 270°
  • Scan time: 25 msec/scan (40.0 Hz)
  • Angular resolution: 0.25°
  • Interface: USB 2.0
  • Weight: 8.2 oz (233 g)

 

The software interface with the LIDAR, as well as the tools to plot the data, was provided through the Robot Operating System (www.ros.org). Below is a sample scan of the raw data obtained from one pan of the LIDAR:

The next step was to filter the data. Two filters were applied: one to remove returns below the minimum range, and one for general noise reduction. The LIDAR has a minimum range of 0.1 meters, so any data returned below that range is essentially useless. The general noise filter was a low-pass filter applied to remove extraneous data caused by any number of factors. Below is the filtered image of the above data:
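As an illustration of that filtering stage, here is a minimal sketch operating on a single scan. The 0.1 m cutoff comes from the spec above, while the moving-average window standing in for the low-pass filter is an assumption.

// Hedged sketch of the two filters: drop returns below the 0.1 m minimum range,
// then smooth the remaining samples with a simple moving average (a basic low-pass filter).
#include <vector>

std::vector<float> filterScan(const std::vector<float> &ranges)
{
    const float minRange = 0.1f;   // UTM-30LX minimum range: anything closer is noise
    const int   window   = 2;      // smoothing half-window in samples (assumed value)

    std::vector<float> out;
    out.reserve(ranges.size());
    for (int i = 0; i < (int)ranges.size(); ++i) {
        if (ranges[i] < minRange)
            continue;                              // discard below-minimum returns
        float sum = 0.0f;
        int count = 0;
        for (int k = -window; k <= window; ++k) {  // average the valid neighbors
            int j = i + k;
            if (j >= 0 && j < (int)ranges.size() && ranges[j] >= minRange) {
                sum += ranges[j];
                ++count;
            }
        }
        out.push_back(sum / count);
    }
    return out;
}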

Now that we had filtered, useful 3D data, we applied the aforementioned height-clearance assumption. This assumption allowed us to compress the 3D map into a 2D map and treat it as such, because we no longer had to worry about overhanging objects. We compress the map such that looking at the 2D map is like looking at the 3D map from a top-down point of view (i.e., standing over the 3D map). While compressing the 3D map, we also transform the data into a 2D occupancy grid representation of the environment. Our grid cells are 10 cm x 10 cm, with each cell marked as either ‘occupied’ or ‘unoccupied’. The occupancy grid allows us to plan paths around the map much more accurately. For comparison, both the top-down view of the above 3D map (top) and the 2D occupancy grid (bottom) are shown below:
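A minimal sketch of that compression step might look like the following. The map size, origin, and floor cutoff are placeholder assumptions, and the point cloud is assumed to already be expressed in the map frame.

// Hedged sketch: collapse filtered 3D points into a top-down 2D occupancy grid.
// Map dimensions, origin, and the floor cutoff are assumed placeholder values.
#include <vector>

struct Point3D { double x, y, z; };

std::vector<std::vector<bool> > buildOccupancyGrid(const std::vector<Point3D> &cloud)
{
    const double cell = 0.10;                        // 10 cm x 10 cm cells
    const int    width = 200, height = 200;          // 20 m x 20 m map (assumed size)
    const double originX = -10.0, originY = -10.0;   // map origin in meters (assumed)
    const double floorCutoff = 0.05;                 // [m] ignore returns near the floor (assumed)

    std::vector<std::vector<bool> > grid(height, std::vector<bool>(width, false));
    for (size_t n = 0; n < cloud.size(); ++n) {
        const Point3D &p = cloud[n];
        if (p.z < floorCutoff)
            continue;                                // ground returns are not obstacles
        // Drop z entirely: any remaining point marks its (x, y) cell as occupied.
        int cx = (int)((p.x - originX) / cell);
        int cy = (int)((p.y - originY) / cell);
        if (cx >= 0 && cx < width && cy >= 0 && cy < height)
            grid[cy][cx] = true;
    }
    return grid;
}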

The final step is the actual path planning. The team decided to implement an A* search algorithm, mainly because A* is guaranteed to be complete and optimal, assuming an admissible heuristic is used. We also considered D* (Dynamic A*), but ultimately decided that the benefits of D* do not apply given our constraints and assumptions. The results of the initial A* implementation are shown below (for demonstration purposes, the end point was chosen to be the opposite side of the map):
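For reference, here is a rough sketch of what a grid-based A* like ours might look like. The 4-connected moves, unit step cost, and straight-line heuristic are assumptions on our part, not a transcript of our implementation.

// Hedged sketch of A* over the occupancy grid. The 4-connected neighborhood, unit costs,
// and Euclidean (admissible) heuristic are illustrative assumptions.
#include <vector>
#include <queue>
#include <cmath>
#include <algorithm>
#include <utility>

struct Node { int x, y; double f; };
struct NodeCmp { bool operator()(const Node &a, const Node &b) const { return a.f > b.f; } };

std::vector<std::pair<int,int> > aStar(const std::vector<std::vector<bool> > &occ,
                                       int sx, int sy, int gx, int gy)
{
    const int H = (int)occ.size(), W = (int)occ[0].size();
    std::vector<std::vector<double> > g(H, std::vector<double>(W, 1e18));
    std::vector<std::vector<std::pair<int,int> > > parent(
        H, std::vector<std::pair<int,int> >(W, std::make_pair(-1, -1)));

    std::priority_queue<Node, std::vector<Node>, NodeCmp> open;
    g[sy][sx] = 0.0;
    Node start = {sx, sy, std::hypot(double(sx - gx), double(sy - gy))};
    open.push(start);
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};

    while (!open.empty()) {
        Node cur = open.top(); open.pop();
        if (cur.x == gx && cur.y == gy) break;                 // goal reached
        for (int k = 0; k < 4; ++k) {
            int nx = cur.x + dx[k], ny = cur.y + dy[k];
            if (nx < 0 || nx >= W || ny < 0 || ny >= H || occ[ny][nx])
                continue;                                      // off-grid or occupied
            double cost = g[cur.y][cur.x] + 1.0;               // unit step cost
            if (cost < g[ny][nx]) {
                g[ny][nx] = cost;
                parent[ny][nx] = std::make_pair(cur.x, cur.y);
                Node next = {nx, ny, cost + std::hypot(double(nx - gx), double(ny - gy))};
                open.push(next);                               // f = g + admissible h
            }
        }
    }
    // Walk the parent pointers back from the goal to recover the path (goal only if unreachable).
    std::vector<std::pair<int,int> > path;
    for (int x = gx, y = gy; x != -1; ) {
        path.push_back(std::make_pair(x, y));
        std::pair<int,int> p = parent[y][x];
        x = p.first; y = p.second;
    }
    std::reverse(path.begin(), path.end());
    return path;
}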

The only problem with typical A*, however, is that it does not account for the dimensions of the object moving through the path. Therefore, the above path is not a true representation of a path the robot could actually take. To account for this, the team discussed two options: first, we could alter the A* algorithm to make sure that all cells within a certain radius of an unexplored node are unoccupied before exploring it; second, we could alter our occupancy grid to account for the dimensions of the robot. We decided to implement the latter by expanding each obstacle by half the width of the robot. Since the point that A* moves through space is the center of the front of the robot, expanding the obstacles this way ensures that that point will always be at least half the robot's width away from where the wall actually is. After all was said and done, here is the path our algorithm returned; as can be seen, it is both optimal and stays a safe distance away from the obstacles:
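For completeness, here is a minimal sketch of that obstacle expansion over the same occupancy-grid representation; the robot half-width used here is a placeholder value.

// Hedged sketch: mark every cell within half a robot-width of an obstacle as occupied,
// so A* can safely treat the robot as a single point. Values are placeholders.
#include <vector>
#include <cmath>

std::vector<std::vector<bool> > inflateObstacles(const std::vector<std::vector<bool> > &grid)
{
    const double cellSize  = 0.10;   // 10 cm cells, as above
    const double halfWidth = 0.30;   // [m] half of the robot's width (assumed value)
    const int r = (int)std::ceil(halfWidth / cellSize);

    const int height = (int)grid.size(), width = (int)grid[0].size();
    std::vector<std::vector<bool> > inflated = grid;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (!grid[y][x])
                continue;                          // only expand around occupied cells
            for (int dy = -r; dy <= r; ++dy) {
                for (int dx = -r; dx <= r; ++dx) {
                    int ny = y + dy, nx = x + dx;
                    if (ny >= 0 && ny < height && nx >= 0 && nx < width
                        && dx * dx + dy * dy <= r * r)   // roughly circular expansion
                        inflated[ny][nx] = true;
                }
            }
        }
    }
    return inflated;
}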

Categories: Computer Science

Maxon Motors and Labview sbRIO

March 20, 2011

We have gained control of Sabertooth's hip! This will allow us to tilt each leg laterally by ~10 degrees in either direction using a linear actuator.

The following components were used for the task:

  • National Instruments sbRIO-9632xt, embedded controller with FPGA
  • National Instruments 9853, 2-port high-speed CAN bus with 1 Mbps transfer speed
  • 4x Maxon EC-max 40 brushless 70 W motors with incremental encoders
  • 4x EPOS2 70/10, 25 A motor controllers

Utilizing the instrumentation mentioned above, we have obtained full control of the linear actuators we designed. The end result allows us to send packets of information with desired positions to the sbRIO. The real-time processor calculates the PID loops and passes the information to the FPGA, which packages it into frames and sends it over the CAN bus to the controllers, or ‘nodes’. Each packet arrives at its desired node in the network, which then executes the frame.

Labview 10's Robotics toolkit provided a great example, with drivers, of how to establish communication with the EPOS2 controller. However, we ran into a significant timeout error, error decimal 7801, which we would like to share and explain in case someone else hits this issue in the future.

Read CAN.vi<ERR>
CAN read timed out attempting to read COB ID x581.  The data you are trying to
read from the CAN bus timed out before it was recieved.  Make sure your device is communicating properly.

Originally, when we contacted the NI support team for help (thanks again, Greg Jannaman!), we thought the problem revolved around the program and the timeout in Read CAN.vi. NI suggested increasing the timeout, and that gave us a successful result which allowed us to establish communication with the controllers. However, after raising another issue we were having, with the EPOS2 controllers starting in the Error/Disabled state (red LED), we realized the real error was in the CAN communication baud rate. The error the EPOS2 would report upon starting was always “CAN in Error Passive Mode”. To fix this we used EPOS Studio and accessed the controller’s Object Dictionary to change the system parameter “CAN Bitrate” from decimal value 0 (1 Mbps) to 9 (automatically detect bitrate). This stopped the error from appearing each time the network was started.

We changed the Motor Position Control.vi to utilize 4 motors. The next step is to implement limit switches on the linear actuators, which will allow us to home our motors on each initialization. We will also add force and current sensors to each leg to give us more information about the state of our system.

 

We’re at an exciting point in our project, and we are thrilled to finally be assembling the entire robot within the next two weeks! More news and pictures coming VERY SOON!

 

Yarde Metals, Onvio, Bobobots, Venom and Hydro-Cutter Join as new Sponsors

March 9, 2011

Hi All,

We’ve been hard at work for the last couple weeks but we’d like to announce a few of our newest sponsors.

Yarde Metals is a worldwide distributor and processor of aluminum, stainless, carbon steel, brass, nickel and titanium alloys. Yarde is headquartered in Southington, CT and has nine service centers located throughout the U.S., and the company has expanded internationally throughout Europe and Asia. They have provided us with some amazing discounts on their metal.

Onvio is the premier U.S. manufacturer of high precision planetary gearboxes, cycloidal reducers and timing belt pulleys. For nearly 30 years, Onvio products have served customers throughout the precision automation industries. They manufacture a complete range of standard products, and their solutions group can offer unique, innovative solutions specific to your needs. We will be using 8 high precision Onvio gearboxes to power the shoulder and knee joints of our robot.

Bobobots is dedicated to the design, engineering and manufacturing of the highest quality, performance driven brushed DC servo motors. The power delivered by these motors is matched by few companies, and we are extremely lucky to have Bobobots as a sponsor.

Venom Group International [VGI] is an evolving entrepreneurial story of what can happen when creative minds meet, actions are taken and dreams are pursued. It is a story about the power of an idea, of innovation, of teamwork and unity of purpose, the power of determination and the power of persistence.  Venom has lots of hobby products including hobby motors, batteries, chargers, and even RC cars.

HYDRO-CUTTER provides precision waterjet cutting services to a wide array of industries and companies, utilizing the latest technologies, equipment and processes to cut materials that are difficult to machine or manufacture using traditional methods. Tom Gravel has been extremely generous, donating countless hours of his machine time to help with some of the manufacturing for this project.

Categories: Uncategorized

Flickr Added!

January 17, 2011

Hi All,

As we've started building, we've been taking some good pictures and will try to document the manufacturing phase of the project. Look on the right side of this page to see our most recent pictures, and click the links for higher-res versions.

Categories: Uncategorized

Leg System Prototyped

January 15, 2011

The team has developed a full scale prototype leg from acrylic, MDF and aluminum. The leg is meant to simulate the real leg conditions as accurately as possible. It works through the same cable system and has springs in series with both the motor and the leg. In this video, we test to see whether our spring system is able to store energy and if our cable system will be able to complete the necessary movement.

Categories: Uncategorized

Body Joint Prototype Complete!

January 14, 2011

Over the past couple of weeks we have been heavily prototyping some of the more experimental systems. Specifically, we have been prototyping a full-scale model of the central joint of the robot. The central body joint of the Sabertooth robot is designed to allow two passive degrees of freedom. Thanks to the passive joint, the robot is able to complete both roll and yaw movements, but with a directional bias back towards the center. This video shows a prototype made out of 3D-printed ABS plastic parts and laser-cut laminated MDF and acrylic. The model is full scale. The final version will be made of aluminum with steel shafts and bolts.

Categories: Mechanical Aspect