Adding a 3D camera to a robot arm (UR5) in URDF for RViz and Gazebo
Hi all,
I am following the book Mastering ROS for Robotics Programming and I am trying to add a 3D camera above my UR5 arm in URDF. But after I modified the ur5.urdf.xacro file to add the camera, it no longer works as expected (the snippet I added and the full problem description are in the issue linked below). Thank you so much!
[Please find the details of my problem with this link](https://github.com/qboticslabs/mastering_ros/issues/37)
Saving geotiff map in hector_slam
Hey everyone,
I'm currently working on a project that involves using hector_slam with a lidar to map its surroundings. At the moment I have hector_slam building maps really well, so now I'm trying to save these maps every few seconds during the mapping process as GeoTIFF files.
I'm running ROS Kinetic on a Raspberry Pi with Ubuntu MATE. I have created a directory called "hector_geotiff/maps" in my home directory in which to save these images. This directory is referenced in the launch files below. While the mapping process is running I am using the command
rostopic pub syscommand std_msgs/String "savegeotiff"
Doing this yields no error messages and gives the output "publishing and latching message. Press ctrl-C to terminate". This gives me the impression that it's working and trying to save the files; however, there are no images appearing in the given directory.
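Since the eventual goal is to trigger a save every few seconds automatically, here is a minimal sketch of a small node that simply automates the rostopic pub call above (the 10-second period and the node name are just placeholders, assuming hector_geotiff is listening on /syscommand as in this setup):
// Minimal sketch: periodically publish "savegeotiff" on /syscommand.
#include <ros/ros.h>
#include <std_msgs/String.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "geotiff_saver_timer");
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<std_msgs::String>("syscommand", 1);

  ros::Rate rate(1.0 / 10.0);   // one save request every 10 seconds (adjust as needed)
  while (ros::ok())
  {
    std_msgs::String cmd;
    cmd.data = "savegeotiff";
    pub.publish(cmd);
    rate.sleep();
  }
  return 0;
}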
I have attached my launch files for tutorial.launch, mapping_default.launch & geotiff_mapper.launch below as well as the output for rostopic list and rosnode list.
Tutorial.launch
Geotiff_mapper.launch
Mapping_default.launch
rosnode list
/base_to_laser_broadcaster
/hector_geotiff_node
/hector_mapping
/hector_trajectory_server
/rosout
/rplidarNode
/rviz
rostopic list
/clicked_point
/initialpose
/map
/map_metadata
/map_updates
/move_base_simple/goal
/poseupdate
/rosout
/rosout_agg
/scan
/slam_cloud
/slam_out_pose
/syscommand
/tf
/tf_static
/trajectory
Extract specific Data from topic
Hello there,
I have a node in ROS publishing data of type Measurement2, which is used for RN sensors. The problem with the datatype is that it publishes several different values, as you can see here:
header:
  seq: 36
  stamp:
    secs: 1531906243
    nsecs: 20823955
  frame_id: "wifi_adapter"
device_name: "simulated"
device_designation: "wifi_adapter"
classification: "W"
values:
  -
    sensor: "Elrob-RN-1"
    source: "wifi signal strength"
    type: "wifi signal strength"
    value: [-30.0]
    unit: "dBm"
    min: 0.0
    avg: 0.0
    max: 0.0
    alert_level: 0
    alert_explanation: ''
    substance: []
    info: "Freq. 2.457 GHz mode: Master address: 08:EA:44:7A:FA:55 quality: 70/70 signal: -30.0"
I want to filter out the signal strength and frequency values from the broadcast topic and then show them in RViz at the current position of my robot, but I haven't found a way yet to extract the data and modify it so I can show it colour-coded on the created map in RViz.
edit: thanks for editing the post to look better; I somehow didn't manage to get the formatting right.
EDIT1:
My setup is a robot with laser scanners and RGBD cameras used to create a 2D and 3D map of its surroundings. Now I have to make something like a heatmap to show where strong signals occur on these generated maps. For this I need to extract specific data from the topic output shown above, in particular the signal strength and frequency values.
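A minimal sketch of one way to do this, assuming the Measurement2 message type is available as a header. The package name rn_msgs, the topic name measurements, and the frames map/base_link below are placeholders, not taken from the question; the field layout follows the rostopic echo output above:
// Sketch only: subscribe to the measurement topic, extract signal strength and frequency,
// and drop a colour-coded sphere marker at the robot's current map position for RViz.
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <visualization_msgs/Marker.h>
#include <rn_msgs/Measurement2.h>   // hypothetical package/header for the Measurement2 type
#include <algorithm>
#include <cstdlib>
#include <string>

ros::Publisher marker_pub;
tf::TransformListener* tf_listener = NULL;
int marker_id = 0;

void measurementCallback(const rn_msgs::Measurement2& msg)
{
  if (msg.values.empty() || msg.values[0].value.empty())
    return;

  double signal_dbm = msg.values[0].value[0];              // e.g. -30.0 dBm
  // The frequency only appears inside the free-text "info" field, so parse it out of the string.
  const std::string& info = msg.values[0].info;
  std::size_t pos = info.find("Freq.");
  double freq_ghz = (pos != std::string::npos) ? std::atof(info.substr(pos + 5).c_str()) : 0.0;

  // Look up where the robot currently is in the map frame, so the marker stays put on the map.
  tf::StampedTransform robot_in_map;
  try {
    tf_listener->lookupTransform("map", "base_link", ros::Time(0), robot_in_map);
  } catch (const tf::TransformException& ex) {
    ROS_WARN("%s", ex.what());
    return;
  }

  visualization_msgs::Marker m;
  m.header.frame_id = "map";
  m.header.stamp = ros::Time::now();
  m.ns = "wifi_heatmap";
  m.id = marker_id++;                                      // one persistent marker per measurement
  m.type = visualization_msgs::Marker::SPHERE;
  m.action = visualization_msgs::Marker::ADD;
  m.pose.position.x = robot_in_map.getOrigin().x();
  m.pose.position.y = robot_in_map.getOrigin().y();
  m.pose.orientation.w = 1.0;
  m.scale.x = m.scale.y = m.scale.z = 0.3;
  // Colour-code roughly -90 dBm (red) to -30 dBm (green).
  double ratio = std::min(1.0, std::max(0.0, (signal_dbm + 90.0) / 60.0));
  m.color.r = 1.0 - ratio;
  m.color.g = ratio;
  m.color.a = 1.0;
  marker_pub.publish(m);

  ROS_INFO("signal: %.1f dBm, frequency: %.3f GHz", signal_dbm, freq_ghz);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "wifi_heatmap_node");
  ros::NodeHandle nh;
  tf::TransformListener listener;
  tf_listener = &listener;
  marker_pub = nh.advertise<visualization_msgs::Marker>("wifi_markers", 10);
  ros::Subscriber sub = nh.subscribe("measurements", 10, measurementCallback);
  ros::spin();
  return 0;
}
In RViz, adding a Marker display on the wifi_markers topic then overlays the colour-coded spheres on the existing map.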
Inverse Kinematics on rviz
Hi,
I am working on the Fetch robot and motion planning, and I have run into the following problem:
Given a series of continuous end-effector poses in work space, the corresponding series of joint-space configurations output by the IK solver is not continuous (it sometimes flips).
I have noticed that when you drag the interactive marker on the end-effector in RViz, the robot's joints always move along a continuous trajectory. Does anyone have any idea what kind of IK algorithm RViz is using?
Thanks a lot!
Running rviz with VNC on a remote computer
I am developing a robot which uses the [Odroid XU4](https://www.hardkernel.com/main/products/prdt_info.php) as its onboard computer. It has a headless setup, i.e. no monitor, keyboard, or mouse. For testing purposes I SSH into the machine, which runs Ubuntu Mate 18.04, and start the `vncserver`. Then, using a VNC viewer application, I have remote desktop access to the machine.
Everything runs fine except `rviz`. When I try to run `rviz` (I run a ROS master process in another tab), I get the following error:
odroid@odroid:~/workspace/racecar$ rosrun rviz rviz
bash: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8)
The X11 connection broke: Unsupported extension used (code 2)
XIO: fatal IO error 22 (Invalid argument) on X server ":1"
after 6 requests (6 known processed) with 0 events remaining.
I have seen solutions where RViz is run on the local machine and all other nodes run on the remote computer. However, my local machine is a Mac which does not have a ROS installation. Is there any way I can get RViz to run on the remote computer?
[MoveIt/Gazebo] Error Controller uvbot_v1/arm_controller failed with error code GOAL_TOLERANCE_VIOLATED
When executing a path in RViz I get a failure with the following warnings in the terminal. How do I change the tolerance?
I have changed the goal tolerance margin in the RViz GUI, but I still get the same error.
Do I need to change the mass value or the friction and damping values in the URDF?
[ WARN] [1535093077.556047667, 156.816000000]: Controller uvbot_v1/arm_controller failed with error code GOAL_TOLERANCE_VIOLATED
[ WARN] [1535093077.556170194, 156.816000000]: Controller handle uvbot_v1/arm_controller reports status ABORTED
[ INFO] [1535093077.632925092, 156.884000000]: Completed trajectory execution with status ABORTED ...
[ INFO] [1535093077.722510041, 156.968000000]: ABORTED: Solution found but controller failed during execution
RGBDSLAMv2 remote online mapping visualization problems in RViz
Hi,
as I already mentioned in earlier posts, I am attempting to remotely build a map in real time with RGBDSLAM over WiFi (if possible). I have a Raspberry Pi 3 on my robot with a Kinect for Xbox 360, from which I am streaming RGB and depth images, while on the workstation side I run RGBDSLAM.
I managed to successfully build a map with the GUI enabled, but in the case of headless mapping, after using the rosservice calls
rosservice call /rgbdslam/ros_ui_b pause false
rosservice call /rgbdslam/ros_ui send_al
and subscribing to the `/rgbdslam/batch_cloud` topic in RViz, I can only visualize the point cloud of the environment in the direction the camera is currently facing.
I want to visualize the map being built incrementally in RViz, like it is done in the RGBDSLAM GUI.
I noticed that I get a "No Map received" warning even though the map frame is published and part of /tf. My TF tree looks like [this](https://imgur.com/a/N4CyxMd). In RViz the TF is "jumping around" the grid, so I suspect something is not right with it, even though no error/warning is produced.
Shouldn't the transform tree have /map as its root, like `/map --> /odom --> /base_link --> /camera_link --> ...`?
I can't seem to get it right even after fiddling with the params for some time. The current setting is like this:
Can somebody guide me on how to resolve these issues? It would be much appreciated.
[Kinetic] mapping is inaccurate
Hi all, I created my own launch file for gmapping, however the map generated in RViz is really inaccurate compared to the Gazebo world. Is there any solution to this? My robot's speed was set to 10 on the keyboard teleop and rotations were done with slow movements. My launch files are as follows:
File: my_gmapping_launch.launch
File: gmapping_params.yaml
base_frame: base_footprint
odom_frame: odom
map_update_interval: 5.0
maxUrange: 6.0
maxRange: 8.0
minimumScore: 200
linearUpdate: 0.1
angularUpdate: 0.1
temporalUpdate: -1.0
resampleThreshold: 0.5
particles: 80
lskip: 10
xmin: -10
ymin: -10
xmax: 10
ymax: 10
delta: 0.05
llsamplerange: 0.01
llsamplestep: 0.01
lasamplerange: 0.005
lasamplestep: 0.005
RViz mapping result and RViz settings: [screenshots]
Gazebo world: [screenshot]
After creating my map I'll be running my amcl launch file to localize my robot in the map.
rviz imported map resize for navigation
Hello,
I'm a beginner!
I have a homemade 2D map in PNG format. I didn't build this map with an RPLIDAR or anything similar.
I can import the file into RViz, but it seems that the size of my map is way too big (in dimensions) compared to what my TurtleBot is seeing.
So when I run navigation, my robot moves very little on my map (in RViz); it looks like the map is bigger than it should be.
How can I resize this or configure the projection? I couldn't find anything about this issue.
Run command:
$ roslaunch turtlebot_navigation amcl_demo.launch map_file:=/home/turtlebot/Maps/office_bitmap.yaml
The yaml file:
image: /home/turtlebot/Maps/office_bitmap.png
resolution: 0.05
origin: [-25.624998, -25.624998, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
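For reference, the real-world scale of an imported PNG comes entirely from this YAML: map width in metres = image width in pixels * resolution (metres per pixel). A quick worked example, assuming the PNG is 1025 pixels wide, which is what the given origin suggests for a map centred on the origin:
1025 px * 0.05 m/px = 51.25 m total width
-51.25 m / 2 = -25.625 m origin x/y, matching the YAML above
So if the map appears far larger than the area the robot actually sees, the resolution value is the parameter that controls the scaling of a hand-made PNG.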
RViz Orbit set different rotation axis
I am using the Orbit view in RViz with a target frame (`world_frame`) that is different than my global fixed frame (`camera_rgb_optical_frame`):

When moving the mouse, I want the view to rotate around the Z-Axis of the target frame (`world_frame`) at its origin. However, the Orbit view is using the global frame's (`camera_rgb_optical_frame`) Z-Axis, which more or less has the same direction as the target frame's Y-Axis. This is also indicated by the yellow disc at the target frame.
How do I change the orientation of this disc, i.e. how can I make the Orbit view to use the Z-Direction of the target frame instead of the Z-Direction of the global frame?
Load two urdf models in one launch file
Hello,
I'm trying to load two models in one launch file. I have one file for a robot arm and one for the base that is intended to carry the robot arm around. Each works separately, but when I try to load them together only the last model is loaded. Basically that's not surprising to me, since I load both into the "robot_description" parameter and therefore overwrite the first loaded model. But how is it done the correct way? If I give the descriptions different names, RViz complains about not having a properly set "robot_description" parameter. Keeping one "robot_description" and naming the other differently leads to a white model, with TF complaining about not having a transform for every link. So, how do I load both models from one launch file?
Thanks to everyone.
Hector Quadrotor Demo: Take Off Not Working.
Hello,
I am trying to use the `hector_quadrotor_demo`, but the UAV is not taking off.
I followed the [wiki](http://wiki.ros.org/hector_quadrotor/Tutorials/Quadrotor%20outdoor%20flight%20demo "Quadrotor outdoor flight demo") and did:
$ roslaunch hector_quadrotor_demo outdoor_flight_gazebo.launch
After which, the link above says:
> gazebo simulation as well as rviz visualization should start up now, and the quadrotor UAV should be on the ground ready for takeoff.
But when I do:
$ rosservice call /enable_motors true
$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0, y: 0.0, z: 1.0}, angular: {x: 0.0,y: 0.0,z: 0.0}}'
The UAV does nothing; it does not take off.
On the console, I have the following:
[INFO] [1536635465.572972, 0.000000]: Controller Spawner: Waiting for service controller_manager/load_controller
(...)
[WARN] [1536635495.736880, 0.001000]: Controller Spawner couldn't find the expected controller_manager ROS interface.
[controller_spawner-6] process has finished cleanly
(...)
[ INFO] [1536633874.045966140, 1.021000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536633912.570868246, 2.021000000]: Waiting for position state to be available before starting /action/pose server
I do not know whether this is related to the UAV not moving.
Thank you in advance.
I'll also include the whole output from when I start the launch file, as there are some warnings which I do not know whether they are related.
process[rosout-1]: started with pid [15785]
started core service [/rosout]
process[gazebo-2]: started with pid [15801]
process[gazebo_gui-3]: started with pid [15814]
process[robot_state_publisher-4]: started with pid [15817]
process[ground_truth_to_tf-5]: started with pid [15820]
process[controller_spawner-6]: started with pid [15821]
[ WARN] [1536635464.538843876]: The root link base_link has an inertia specified in the URDF, but KDL does not support a root link with an inertia. As a workaround, you can add an extra dummy link to your URDF.
process[estop_relay-7]: started with pid [15839]
process[pose_action-8]: started with pid [15847]
process[landing_action-9]: started with pid [15864]
process[takeoff_action-10]: started with pid [15870]
process[spawn_robot-11]: started with pid [15879]
process[rviz-12]: started with pid [15896]
[ INFO] [1536635464.891581683]: waitForService: Service [/enable_motors] has not been advertised, waiting...
[WARN] [1536635465.572509, 0.000000]: DEPRECATION warning: --shutdown-timeout has no effect.
[INFO] [1536635465.572972, 0.000000]: Controller Spawner: Waiting for service controller_manager/load_controller
[ INFO] [1536635466.626203711]: Finished loading Gazebo ROS API Plugin.
[ INFO] [1536635466.658819532]: Finished loading Gazebo ROS API Plugin.
[ INFO] [1536635466.682518696]: waitForService: Service [/gazebo/set_physics_properties] has not been advertised, waiting...
[ INFO] [1536635466.698010678]: waitForService: Service [/gazebo/set_physics_properties] has not been advertised, waiting...
SpawnModel script started
[INFO] [1536635467.486189, 0.000000]: Loading model XML from ros parameter
[INFO] [1536635467.489836, 0.000000]: Waiting for service /gazebo/spawn_urdf_model
[INFO] [1536635474.428590, 0.000000]: Calling service /gazebo/spawn_urdf_model
[ INFO] [1536635481.476748863, 0.001000000]: waitForService: Service [/enable_motors] has not been advertised, waiting...
[ INFO] [1536635481.481839312, 0.001000000]: waitForService: Service [/enable_motors] has not been advertised, waiting...
[WARN] [1536635495.736880, 0.001000]: Controller Spawner couldn't find the expected controller_manager ROS interface.
[controller_spawner-6] process has finished cleanly
log file: (...)
[INFO] [1536635507.757311, 0.001000]: Spawn status: SpawnModel: Successfully spawned entity
[spawn_robot-11] process has finished cleanly
log file: (...)
[ INFO] [1536635508.155567834, 0.001000000]: imu plugin missing , defaults to 0s
[ INFO] [1536635510.862780756, 0.001000000]: Loading gazebo_ros_control plugin
[ WARN] [1536635510.892357204, 0.001000000]: Desired controller update period (0.010000000 s) is slower than the gazebo simulation period (0.001000000 s).
[ INFO] [1536635510.892919175, 0.001000000]: Starting gazebo_ros_control plugin in namespace: /
[ INFO] [1536635510.924941575, 0.001000000]: gazebo_ros_control plugin is waiting for model URDF in parameter [robot_description] on the ROS param server.
[ INFO] [1536635511.556073000, 0.001000000]: Limits /wrench_limits/force/z initialized z with min 0 and max 30
[ INFO] [1536635511.599553708, 0.001000000]: Limits /wrench_limits/torque/x initialized x with min -10 and max 10
[ INFO] [1536635511.628548891, 0.001000000]: Limits /wrench_limits/torque/y initialized y with min -10 and max 10
[ INFO] [1536635511.657604590, 0.001000000]: Limits /wrench_limits/torque/z initialized z with min -1 and max 1
[ INFO] [1536635511.778303860, 0.001000000]: Loaded gazebo_ros_control.
Loaded the following quadrotor propulsion model parameters from namespace /quadrotor_propulsion:
k_m = -7.01163e-05
k_t = 0.0153369
CT2s = 0
CT1s = -0.00025224
CT0s = 1.53819e-05
Psi = 0.00724218
J_M = 2.57305e-05
R_A = 0.201084
l_m = 0.275
alpha_m = 0.104864
beta_m = 0.549262
Loaded the following quadrotor drag model parameters from namespace /quadrotor_aerodynamics:
C_wxy = 0.12
C_wz = 0.1
C_mxy = 0.0741562
C_mz = 0.0506433
[ INFO] [1536635513.032994905, 0.021000000]: waitForService: Service [/gazebo/set_physics_properties] is now available.
[ INFO] [1536635513.035595702, 0.021000000]: waitForService: Service [/enable_motors] is now available.
[ INFO] [1536635513.037742759, 0.021000000]: waitForService: Service [/enable_motors] is now available.
[ INFO] [1536635513.042112456, 0.021000000]: waitForService: Service [/gazebo/set_physics_properties] is now available.
[ INFO] [1536635513.050777041, 0.021000000]: waitForService: Service [/enable_motors] is now available.
[ INFO] [1536635513.436044999, 0.025000000]: Physics dynamic reconfigure ready.
[ INFO] [1536635513.571756934, 0.032000000]: Physics dynamic reconfigure ready.
[ INFO] [1536635547.856579762, 1.021000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536635582.537188882, 2.021000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536635616.124352615, 3.021000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536635651.745664328, 4.021000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536635684.933580989, 5.021000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536635722.063089350, 6.022000000]: Waiting for position state to be available before starting /action/pose server
[ INFO] [1536635759.828425936, 7.121000000]: Waiting for position state to be available before starting /action/pose server
How to configure move_base properly
I am trying to run topological_navigation (documentation here: https://github.com/strands-project/strands_navigation) on a Toyota HSR robot, which requires a running instance of move_base. I've run move_base using the command `rosrun move_base move_base`, but when I give my robot a navigation goal I get the error "The robot's start position is off the global costmap. Planning will always fail, are you sure the robot has been properly localized?" even though I have properly localized the robot. Any help would be much appreciated.
VR Headset for ROS
Hey!
We are thinking about buying a VR headset. Partially for the lulz, but it would also be great to be able to use it within RViz or Gazebo so you can walk around your robot before it is built. AFAIR the Vive is not supported on Linux, and there is already an [rviz-plugin](https://github.com/ros-visualization/oculus_rviz_plugins/issues) for the Rift, but that project looks a bit dead.
Is anyone using a headset currently?
Get data from rviz
Hi all, today I did my first simple work in RViz: projecting depth image data onto the plane using the PointCloud display. I am wondering whether there is a way to get the final displayed data from RViz and later reproduce the same plot in, say, MATLAB. I know I can get the depth/image/points message by simply echoing the topic.
If so, what is the easiest way to achieve this? Thank you all very much.
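One way to reproduce the plot outside RViz is to skip RViz entirely and dump the same cloud the display is showing. A minimal sketch, assuming the topic is /camera/depth/points with XYZ points and /tmp/cloud.csv as an example output path; MATLAB can then read the file with readmatrix or csvread:
// Sketch: subscribe once to a PointCloud2 topic and write x,y,z per line to a CSV file.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl_conversions/pcl_conversions.h>
#include <fstream>
#include <cmath>

void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
{
  pcl::PointCloud<pcl::PointXYZ> cloud;
  pcl::fromROSMsg(*msg, cloud);            // convert the ROS message to a PCL cloud

  std::ofstream out("/tmp/cloud.csv");
  for (size_t i = 0; i < cloud.size(); ++i)
  {
    const pcl::PointXYZ& p = cloud.points[i];
    if (std::isfinite(p.x))                // skip NaN points from the depth image
      out << p.x << "," << p.y << "," << p.z << "\n";
  }
  ROS_INFO("Wrote %zu points to /tmp/cloud.csv", cloud.size());
  ros::shutdown();                         // one snapshot is enough
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "cloud_to_csv");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/camera/depth/points", 1, cloudCallback);
  ros::spin();
  return 0;
}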
Trouble visualizing PointCloud2 in Rviz
Hello,
I publish a PointCloud2 message to a predefined topic that RViz's PointCloud display subscribes to (/output), but the point cloud does not show up in the visualization. I know it publishes successfully because there is no error message in the RViz display window.
In the launch file for my program, I launch the demo.launch for my TX90 robot that was created through the moveit_setup_assistant,
which starts up RViz, which visualizes my TX90 robot but not the PointCloud2 message.
I believe this may be a problem with TF, so I added a static_transform_publisher to my launch file like so:
But this does not change anything except adding the frame "head" to my transform tree (which I verified using tf view_frames); I am still not able to view the point cloud in RViz. It says I do not have enough karma to post my frames.pdf file.
My code for this portion is as follows:
// Includes needed for this snippet (roscpp + PCL + pcl_conversions, inferred from the calls below):
#include <ros/ros.h>
#include <ros/package.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
#include <pcl/conversions.h>
#include <pcl_conversions/pcl_conversions.h>
#include <boost/filesystem.hpp>
#include <sstream>

// Advertise the topic the RViz display subscribes to (the template argument was missing).
ros::Publisher pc_pub = n.advertise<sensor_msgs::PointCloud2>("/output", 1);
// Build the path to the PCD file inside the mover_node package.
boost::filesystem::path pkgPath, imgPath;
std::stringstream ss;
pkgPath = ros::package::getPath("mover_node");
ss << pkgPath.c_str() << "/Wound_3d_mesh.pcd";
std::string fpath = ss.str();
sensor_msgs::PointCloud2 cloud_msg;
pcl::PointCloud<pcl::PointXYZ> cloud;              // point type template argument was missing
pcl::PCLPointCloud2 point_cloud2;
pcl::io::loadPCDFile(fpath, cloud);                // load the PCD from disk
//pcl::PointCloud<pcl::PointXYZ>::ConstPtr tmp_cloud = cloud;
pcl::toPCLPointCloud2(cloud, point_cloud2);        // PCL cloud -> PCLPointCloud2
pcl_conversions::fromPCL(point_cloud2, cloud_msg); // PCLPointCloud2 -> sensor_msgs/PointCloud2
cloud_msg.header.frame_id = "head";                // must match a frame that exists in TF
cloud_msg.header.stamp = ros::Time::now();
pc_pub.publish(cloud_msg);
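One detail worth checking, although it is not visible in the snippet: if this publish happens only once, RViz may subscribe to /output after the message has already been sent and never receive it. Latching the publisher (or publishing in a loop) avoids that:
// The third argument "true" makes the publisher latched: the last message is re-delivered to any
// subscriber (e.g. RViz) that connects after the publish call.
ros::Publisher pc_pub = n.advertise<sensor_msgs::PointCloud2>("/output", 1, true);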
No transform from [base_link] to frame [map]
I followed this [TF ROS tutorial](http://wiki.ros.org/tf/Tutorials/Introduction%20to%20tf).
**Error encountered:**
The TF status is 'Warn', with three error messages:
1. No transform from [turtle1] to frame [map]
2. No transform from [turtle2] to frame [map]
3. No transform from [world] to frame [map]
I've tried these commands (one at a time), viz.,
$ rosrun tf static_transform_publisher 0.0 0.0 0.0 0.0 0.0 0.0 turtle1 turtle2 100
$ rosrun tf static_transform_publisher 0.0 0.0 0 0.0 0.0 0.0 turtle1 turtle2 world
$ rosrun tf static_transform_publisher 0.0 0.0 0 0.0 0.0 0.0 turtle1 turtle2 world 100
to resolve this, while the following command is running in one of the terminals:
$ rosrun tf static_transform_publisher 0.0 0.0 0.0 0.0 0.0 0.0 map my_frame 100
I've also followed #q10762, but it didn't help in this case.
What am I missing here? Please correct and clarify.
Any constructive discussions or leads will be much appreciated. Thanks in advance.
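For what it's worth, the same kind of transform can also be published from a small node instead of the static_transform_publisher command-line tool; a minimal sketch, where the frame names map and world are taken from the error messages above and the identity transform is just an example:
// Sketch: broadcast a fixed map -> world transform at 10 Hz.
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "map_to_world_broadcaster");
  ros::NodeHandle nh;
  tf::TransformBroadcaster br;
  tf::Transform t;
  t.setOrigin(tf::Vector3(0.0, 0.0, 0.0));
  t.setRotation(tf::Quaternion(0.0, 0.0, 0.0, 1.0));   // identity rotation

  ros::Rate rate(10.0);                                 // roughly the 100 ms period of the CLI tool
  while (ros::ok())
  {
    // Parent frame first, then child: this makes "map" the parent of "world".
    br.sendTransform(tf::StampedTransform(t, ros::Time::now(), "map", "world"));
    rate.sleep();
  }
  return 0;
}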
CE30-C Benewake solid state LiDAR usage
I have a CE30-C Benewake LiDAR sensor, ROS Kinetic (catkin), and Ubuntu 16.04 LTS.
ROS drivers for this sensor: [link text](https://github.com/zlite/benewake/tree/master/CE30-C_ROS-master)
I couldn't find any proper documentation or tutorials on the internet covering this sensor's usage: running it, 3D point cloud visualization, mapping, or localization.
Please help me with these tasks:
1. Usage and running this sensor.
2. LiDAR data visualization in rviz.
3. 3D mapping using this sensor in rviz.
4. Implementation of localization using this sensor.
Thanks in advance.
ERROR: cannot launch node of type [rviz/rviz]: can't locate node [rviz] in package [rviz]
Hi guys, I'm using a Raspberry Pi 3B to run an RPLIDAR and show the scanning result through RViz. I followed the GitHub rplidar lesson, but when I execute the command "roslaunch rplidar_ros view_rplidar.launch" it always shows **"ERROR: cannot launch node of type [rviz/rviz]: can't locate node [rviz] in package [rviz]"**. To check RViz I ran "rviz rviz", and it opened RViz successfully.
I've also tried 1. roslaunch rplidar_ros rplidar.launch and 2. rosrun rplidar_ros rplidarNodeClient; they worked successfully to read the information from the lidar and show it in the terminal.
Please help me :( thanks in advance!