Joint_trajectory_controller should not generate extra waypoints #669
Comments
Let me know if these help clear up your questions: https://control.ros.org/master/doc/ros2_controllers/joint_trajectory_controller/doc/userdoc.html# http://wiki.ros.org/joint_trajectory_controller/UnderstandingTrajectoryReplacement
I started with these docs, but they haven't helped me (maybe I didn't understand something). In the meantime I looked into the source code, and it occurred to me that this "interpolation_method" parameter may be responsible for this. It is by default: spline. So hopefully I can switch off this behavior by changing it to "None". Still, I don't understand why this functionality is implemented in this controller, as I would expect it to be a MoveIt job. Since it is on by default, the parameter is not set explicitly in the config file of turtlebot3_manipulation, so it gave me no clue to check its meaning. The second link you provided is about updating an ongoing trajectory, which is not my case: I don't send a new trajectory before the previous one has ended.
@lmendyk: interpolation as performed by the controller is essentially a 'service' to the action client. It's very often the case that the hardware controlled by the controller requires denser trajectories (both in time and space) than the trajectory representation (in ROS, but I believe this holds in general) could efficiently represent (or encode). Interpolation is the go-to strategy when you want to reconstruct a signal at a higher sampling rate than the input provides.

In essence, this allows the controller to send smooth motion (as you describe it) to the hw iface at a very high resolution (again: in time and space) even if the input trajectory is (very) sparse. If you have servos which implement their own interpolation, this may not be needed. But in many cases signal continuity is a requirement, and without interpolation, that would require action clients like MoveIt to send much denser trajectories themselves.

Technically, you could set things up as you describe: let whoever (or whatever) comes up with the trajectory worry about space and time resolution, and the controller would not interpolate. With the parameter you found, that's possible. In reality however, interpolation is very useful -- the default MoveIt configuration, for instance, typically generates only about 20 to 40 points for a simple trajectory -- which is why it's enabled by default.

Having written all of that, I don't believe you can conclude that the "Joint_trajectory_controller should not generate extra waypoints". First of all, it doesn't really generate waypoints. The original trajectory is still there. It's just used as input to an interpolation algorithm, for both efficiency and flexibility reasons.
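To make the mechanism concrete, here is a minimal sketch (not the actual ros2_controllers implementation) of how spline interpolation lets a controller derive dense position setpoints from a sparse trajectory. All numbers are made-up illustration values; the waypoints mirror the two-point goal discussed in this issue.

```python
# Sketch only: cubic Hermite interpolation between two trajectory waypoints,
# sampled at a (hypothetical) controller update rate.

def cubic_segment(p0, p1, v0, v1, T, t):
    """Cubic Hermite interpolation between two waypoints over duration T."""
    s = t / T
    h00 = 2*s**3 - 3*s**2 + 1
    h10 = s**3 - 2*s**2 + s
    h01 = -2*s**3 + 3*s**2
    h11 = s**3 - s**2
    return h00*p0 + h10*T*v0 + h01*p1 + h11*T*v1

# Sparse input: two waypoints 1 s apart, zero velocity at both ends.
p0, p1, T = 0.0, 1.57, 1.0

# Sampling the spline at an assumed 100 Hz update rate yields ~100
# intermediate setpoints that the trajectory message itself never contained.
rate_hz = 100
setpoints = [cubic_segment(p0, p1, 0.0, 0.0, T, k / rate_hz)
             for k in range(rate_hz + 1)]
print(len(setpoints), setpoints[0], setpoints[-1])  # 101 0.0 1.57
```

The original two waypoints are still the endpoints; the interpolation merely fills in the signal between them at the control rate.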
@gavanderhoorn - thank you for this detailed explanation. It is because I'm not an experienced person; I started learning by trying to understand what MoveIt2 does, and reading about planning pipelines with plugins that optimize initial trajectories generated by a planner misled me into thinking that that system is responsible for the final smoothness of the trajectory. So maybe a suggestion: add a sentence or two to the documentation :) saying that the controller by default does lower-level trajectory refinement (in the sense of what is sent to the actuators -- "virtual" points :) ). And maybe a hint that this may not be needed for hardware which provides only a position command interface and thus probably performs speed and acceleration profiling itself based on the sparse trajectory. Once again, thank you for your explanation.
That would perhaps make for an excellent pull request. You are currently in the best position to formulate your thoughts about this in a way newcomers would find understandable. (But please don't use the word "tuning"; that's not what it's doing.)
Again: not a conclusion you can draw / a statement you can make (without adding more ifs). I work with position-only interfaces daily, and those require 125 to 250 Hz sampling rates with continuity requirements up to (and including) jerk, and do not do any velocity or acceleration calculations themselves. This is exactly why the controller interpolates.
Maybe the problem is deeper than I thought. I checked, and the configuration prepared by the robot vendor seems to have an unrealistic update rate of 100 Hz for a system based on a Raspberry Pi running Ubuntu without an RT kernel. So maybe instead of switching off the interpolation mechanism I must get the update rate right.
I agree that the joint_trajectory_controller could have more detailed documentation. Feel free to make a suggestion via a PR where we can discuss this further. If you want to shape the trajectory yourself (e.g., using Ruckig or other methods like S-shaped/7-step jerk-limited trajectory generators), or want the motor controller to do this job: just use a forward controller instead of the JTC.
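For readers unfamiliar with the trajectory-shaping idea mentioned above, here is a minimal sketch of a trapezoidal velocity profile, one simple way to generate a point-to-point motion yourself before streaming positions to a forward (position) controller. This is not Ruckig and not any ros2_control API; the limits and step size are made-up illustration values.

```python
import math

def trapezoid(distance, v_max, a_max, dt):
    """Yield positions along a trapezoidal (or triangular) velocity profile."""
    t_acc = v_max / a_max                    # time to reach cruise velocity
    d_acc = 0.5 * a_max * t_acc**2           # distance covered while accelerating
    if 2 * d_acc > distance:                 # too short to cruise: triangular profile
        t_acc = math.sqrt(distance / a_max)
        v_max = a_max * t_acc
        d_acc = 0.5 * a_max * t_acc**2
    t_cruise = (distance - 2 * d_acc) / v_max
    t_total = 2 * t_acc + t_cruise
    t = 0.0
    while t < t_total:
        if t < t_acc:                        # accelerating
            p = 0.5 * a_max * t**2
        elif t < t_acc + t_cruise:           # cruising
            p = d_acc + v_max * (t - t_acc)
        else:                                # decelerating
            td = t_total - t
            p = distance - 0.5 * a_max * td**2
        yield p
        t += dt
    yield distance

# Illustrative use: a 1.57 rad move with assumed joint limits, sampled at 100 Hz.
positions = list(trapezoid(1.57, v_max=1.0, a_max=2.0, dt=0.01))
```

Each yielded position would be sent as one command through a forward controller at the chosen rate; jerk-limited (S-curve) generators refine the same idea with bounded acceleration changes.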
OK, it took me a while to get close to a satisfying result, unfortunately doing it in a trial-and-error manner. Finally I returned to my original problem, which was getting smooth movements generated by the MoveIt stack.
@gavanderhoorn @christophfroehlich - thank you for your comments. As to your suggestion that I make a PR for better documentation: I'm afraid I still don't know enough to prepare a valuable PR without causing "irritation" 😊 through my ignorance. In fact I have gone further than initially anticipated. My ambition was just to use high-level tools like MoveIt without getting into the details of the controllers, and to concentrate on my main domain of interest: 5G-based MEC, which could host "brain" applications like MoveIt to control robots at a high level via a 5G URLLC network (with the low-level controllers running on robot hardware) in close to real time. But I find this subject interesting, so I would appreciate your hints on where I can find good materials explaining the basics. Especially if there are some best practices which define:
@lmendyk your questions are rather general and not well suited for this issue; please ask your questions on https://robotics.stackexchange.com. I'll close this one; feel free to reopen if you have questions about your original problem.
The Joint_trajectory_controller seems to generate some extra waypoints instead of just executing a defined trajectory. This seems like redundant functionality which overlaps with solutions like MoveIt, which should be responsible for generating smooth trajectories with potentially many waypoints to smooth the movement.
Background:
I'm still learning robotics and I'm fighting with jerky movements of a TurtleBot3 with Open Manipulator; see ROBOTIS-GIT/turtlebot3_manipulation#57.
The hardware implements only a position command interface. So I decided to "start from basics" and, instead of having MoveIt2 generate the trajectory, I created the trajectory manually, consisting of only a start and an end point per task, hoping to get a smooth movement done by the hardware based on its speed and acceleration profile mechanism:
ros2 action send_goal /arm_controller/follow_joint_trajectory control_msgs/action/FollowJointTrajectory
"{trajectory: {joint_names:[joint1, joint2, joint3, joint4], points: [
{positions:[0.0, 0.0, 0.0, 0.0], time_from_start: {sec: 0, nanosec: 0} },
{positions:[1.57, 0.78, -0.78, 0.0], time_from_start: {sec: 1, nanosec: 0} }
]} }"
But what I observed is that for this trajectory (just a start and an end point) the movement was very jerky, instead of the one consistent movement which I expected the hardware to perform after being forwarded just the end position, according to the doc ("For command interfaces position, the desired positions are simply forwarded to the joints").
When I added debugging to TurtleBot3ManipulationSystemHardware::write(), I discovered that the Joint_trajectory_controller had issued extra waypoints (4 waypoints, counting the start one), which caused the robot arm to brake and restart at each of these additional waypoints. So instead of smoothing, it added jerks to the arm movements.
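The observed setpoint count can be sanity-checked with quick arithmetic (a sketch with assumed numbers; the 100 Hz figure matches the vendor-configured update rate mentioned elsewhere in this thread):

```python
# Back-of-the-envelope check: how many write() calls should a 1-second
# trajectory produce if the control loop really runs at its configured rate?
update_rate_hz = 100          # assumed vendor-configured update rate
trajectory_duration_s = 1.0   # duration of the two-point trajectory above
expected_setpoints = int(update_rate_hz * trajectory_duration_s)
print(expected_setpoints)     # 100
```

Seeing only ~4 setpoints instead of ~100 would suggest the control loop is running far below its configured rate, rather than the controller "adding" waypoints to the trajectory.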
To my (still limited) understanding, that behavior is strange, as the controller should just execute the trajectory and should not try to "smooth" the movement by adding extra waypoints. As I understand it, that type of logic should be reserved for a higher-level system like MoveIt.
When I changed the trajectory to just one point:
ros2 action send_goal /arm_controller/follow_joint_trajectory control_msgs/action/FollowJointTrajectory
"{trajectory: {joint_names:[joint1, joint2, joint3, joint4], points: [
{positions:[1.57, 0.78, -0.78, 0.0], time_from_start: {sec: 0, nanosec: 0} }
]} }"
The movement was smooth, as the end position was simply forwarded to the hardware, which was responsible for executing the smooth movement according to its profile mechanism.
Am I missing something and this behavior of the controller is the proper one?
BTW - what is the proper way of getting "smooth" movements for robot arms which provide only a position command interface? I wonder if, as an alternative to forcing one-step trajectories as described above, I could do something which seems to contradict common sense: lowering the max speed of the joint motors so they are slow enough that a movement can't finish before the next waypoint is generated by the controller, so the override mechanism could ensure movement smoothness for a multi-point trajectory.
I would appreciate any comments and suggestions.