PyNiryo API Documentation
TCP client
- pyniryo.api.tcp_client.get_deprecation_msg(old_method, new_method)
- class pyniryo.api.tcp_client.NiryoRobot(ip_address=None, verbose=True)
- Parameters:
ip_address (str) – IP address of the robot
verbose (bool) – Enable or disable the information logs
- connect(ip_address)ο
Connect to the TCP Server
- Parameters:
ip_address (str) β IP Address
- Return type:
None
- close_connection()ο
Close connection with robot
- Return type:
None
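Example (a minimal connection sketch; the IP address below is a placeholder for your robot's address):
from pyniryo import NiryoRobot
robot = NiryoRobot("10.10.10.10")  # connects immediately when an IP address is given
# ... use the robot ...
robot.close_connection()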
- calibrate(calibrate_mode)ο
Calibrate (manually or automatically) motors. Automatic calibration will do nothing if motors are already calibrated
- Parameters:
calibrate_mode (CalibrateMode) β Auto or Manual
- Return type:
None
- calibrate_auto()
Start an automatic motor calibration if the motors are not calibrated yet
- Return type:
None
- need_calibration()
Return a bool indicating whether the robot motors need to be calibrated
- Return type:
bool
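Example (a typical startup sketch combining these calls; assumes a connected robot instance named robot):
from pyniryo import CalibrateMode
if robot.need_calibration():
    robot.calibrate_auto()           # does nothing if the motors are already calibrated
# or request a specific mode explicitly:
robot.calibrate(CalibrateMode.AUTO)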
- get_learning_mode()
Get learning mode state
- Returns:
True if learning mode is on
- Return type:
bool
- property learning_mode
Get learning mode state
Example:
if robot.learning_mode:
    print("Torque off")
else:
    print("Torque on")
- Returns:
True if learning mode is on
- Return type:
bool
- set_learning_mode(enabled)
Set learning mode if param is True, else turn it off
- Parameters:
enabled (bool) – True or False
- Return type:
None
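Example (a short sketch of toggling learning mode on a connected robot):
robot.set_learning_mode(True)     # torque off, the arm can be moved by hand
print(robot.get_learning_mode())  # True
robot.set_learning_mode(False)    # torque back on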
- set_arm_max_velocity(percentage_speed)ο
Limit arm max velocity to a percentage of its maximum velocity
- Parameters:
percentage_speed (int) β Should be between 1 & 100
- Return type:
None
- jog_control()
Context manager to enable jog control mode during a block of code
Example:
with robot.jog_control():
    robot.jog(JointsPosition(0.1, 0.0, 0.0, 0.0, 0.0, 0.0))
- set_jog_control(enabled)
Set jog control mode if param is True, else turn it off
- Parameters:
enabled (bool) – True or False
- Return type:
None
- static wait(duration)ο
Wait for a certain time
- Parameters:
duration (float) β duration in seconds
- Return type:
None
- property collision_detected
True if a collision has been detected during a previous movement
- Type:
bool
- clear_collision_detected()
Reset the internal flag collision_detected
- get_joints()
Get joint values in radians
- Returns:
Robot's current joints position
- Return type:
JointsPosition
- get_pose()
Get end effector link pose. x, y & z are expressed in meters / roll, pitch & yaw are expressed in radians
- Return type:
PoseObject
- get_pose_quat()ο
Get end effector link pose in Quaternion coordinates
- Returns:
Position and quaternion coordinates concatenated in a list : [x, y, z, qx, qy, qz, qw]
- Return type:
list[float]
- get_pose_v2()
Get end effector link pose. x, y & z are expressed in meters / roll, pitch & yaw are expressed in radians
- Return type:
PoseObject
- property joints
Robot's current joints position
- Type:
JointsPosition
- move_joints(*args)
Deprecated since version 1.2.0: You should use move() with a JointsPosition object.
Move robot joints. Joints are expressed in radians.
All lines of the next example realize the same operation:
robot.joints = [0.2, 0.1, 0.3, 0.0, 0.5, 0.0]
robot.move_joints([0.2, 0.1, 0.3, 0.0, 0.5, 0.0])
robot.move_joints(0.2, 0.1, 0.3, 0.0, 0.5, 0.0)
- Parameters:
args (Union[list[float], tuple[float]]) – either 6 args (one for each joint) or a list of 6 joints
- Return type:
None
- property pose
Get end effector link pose. x, y & z are expressed in meters / roll, pitch & yaw are expressed in radians
- Type:
PoseObject
- property pose_v2
Get end effector link pose. x, y & z are expressed in meters / roll, pitch & yaw are expressed in radians
- Type:
PoseObject
- move_pose(*args)
Deprecated since version 1.2.0: You should use move() with a PoseObject object.
Move robot end effector pose to a (x, y, z, roll, pitch, yaw, frame_name) pose in a particular frame (frame_name) if defined. x, y & z are expressed in meters / roll, pitch & yaw are expressed in radians
All lines of the next example realize the same operation:
robot.pose = [0.2, 0.1, 0.3, 0.0, 0.5, 0.0]
robot.move_pose([0.2, 0.1, 0.3, 0.0, 0.5, 0.0])
robot.move_pose(0.2, 0.1, 0.3, 0.0, 0.5, 0.0)
robot.move_pose(PoseObject(0.2, 0.1, 0.3, 0.0, 0.5, 0.0))
robot.move_pose([0.2, 0.1, 0.3, 0.0, 0.5, 0.0], "frame")
robot.move_pose(PoseObject(0.2, 0.1, 0.3, 0.0, 0.5, 0.0), "frame")
- Parameters:
args (Union[tuple[float], list[float], PoseObject, [tuple[float], str], [list[float], str], [PoseObject, str]]) – either 7 args (one for each coordinate and one for the frame name), or a list of 6 coordinates, or a PoseObject and the frame name
- Return type:
None
- move_linear_pose(*args)
Deprecated since version 1.2.0: You should use move() with a PoseObject object and move_cmd=Command.MOVE_LINEAR_POSE.
Move robot end effector pose to a (x, y, z, roll, pitch, yaw) pose with a linear trajectory, in a particular frame (frame_name) if defined
- Parameters:
args (Union[tuple[float], list[float], PoseObject, [tuple[float], str], [list[float], str], [PoseObject, str]]) – either 7 args (one for each coordinate and one for the frame name), or a list of 6 coordinates, or a PoseObject and the frame name
- Return type:
None
- move(robot_position, linear=False)ο
Move the robot to the given position. The position can be expressed in joints or in pose coordinates. Distances are expressed in meters, and angles are expressed in radians. If a move pose is performed, move_cmd can be specified to move the robot according to the command
Examples
robot.move(PoseObject(0.2, 0.1, 0.3, 0.0, 0.5, 0.0, metadata=PoseMetadata.v2(frame="frame")))
robot.move(PoseObject(0.2, 0.1, 0.3, 0.0, 0.5, 0.0), linear=True)
robot.move(JointsPosition(0.2, 0.1, 0.3, 0.0, 0.5, 0.0))
- Parameters:
robot_position (Union[PoseObject, JointsPosition]) – either a joints position or a pose
linear (bool) – do a linear move (works only with a PoseObject)
- Return type:
None
- shift_pose(axis, shift_value, linear=False)
Shift robot end effector pose along one axis
- Parameters:
axis (RobotAxis) – Axis along which the robot is shifted
shift_value (float) – In meters for X/Y/Z and radians for roll/pitch/yaw
linear (bool) – Whether the movement has to be linear or not
- Return type:
None
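Example (a small shift_pose sketch; RobotAxis is the enum used for the axis argument and the shift values are illustrative):
from pyniryo import RobotAxis
robot.shift_pose(RobotAxis.Z, 0.05)                # raise the end effector by 5 cm
robot.shift_pose(RobotAxis.YAW, 0.1, linear=True)  # rotate yaw by 0.1 rad with a linear trajectory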
- shift_linear_pose(axis, shift_value)
Deprecated since version 1.2.0: You should use shift_pose() with linear=True.
Shift robot end effector pose along one axis, with a linear trajectory
- Parameters:
axis (RobotAxis) – Axis along which the robot is shifted
shift_value (float) – In meters for X/Y/Z and radians for roll/pitch/yaw
- Return type:
None
- jog_joints(*args)
Deprecated since version 1.2.0: You should use jog() with a JointsPosition object.
Jog robot joints. Jog corresponds to a shift without motion planning. Values are expressed in radians.
- Parameters:
args (Union[list[float], tuple[float]]) – either 6 args (one for each joint) or a list of 6 joint offsets
- Return type:
None
- jog_pose(*args)
Deprecated since version 1.2.0: You should use jog() with a PoseObject object.
Jog robot end effector pose. Jog corresponds to a shift without motion planning. Arguments are [dx, dy, dz, d_roll, d_pitch, d_yaw]. dx, dy & dz are expressed in meters / d_roll, d_pitch & d_yaw are expressed in radians
- Parameters:
args (Union[list[float], tuple[float]]) – either 6 args (one for each coordinate) or a list of 6 offsets
- Return type:
None
- jog(robot_position)
Jog the robot by the given offset, expressed either as a JointsPosition or a PoseObject. Jog corresponds to a shift without motion planning.
- move_to_home_pose()ο
Move to a position where the forearm lays on shoulder
- Return type:
None
- go_to_sleep()ο
Go to home pose and activate learning mode
- Return type:
None
- forward_kinematics(*args)
Compute forward kinematics of a given joints configuration and give the associated spatial pose
- Parameters:
args (Union[list[float], tuple[float], JointsPosition]) – either 6 args (one for each joint), or a list of 6 joints, or a JointsPosition instance
- Return type:
PoseObject
- inverse_kinematics(*args)
Compute inverse kinematics
- Parameters:
args (Union[tuple[float], list[float], PoseObject]) – either 6 args (one for each coordinate), or a list of 6 coordinates, or a PoseObject
- Returns:
List of joint values
- Return type:
list[float]
- forward_kinematics_v2(joints_position)
Compute forward kinematics of a given joints configuration and give the associated spatial pose
- Parameters:
joints_position (JointsPosition) – Joints configuration
- Return type:
PoseObject
- inverse_kinematics_v2(pose)
Compute inverse kinematics
- Parameters:
pose (PoseObject) – Robot pose
- Returns:
The joints position
- Return type:
JointsPosition
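Example (a round-trip sketch with the v2 kinematics helpers; the joint values are illustrative):
joints = JointsPosition(0.0, 0.3, -0.4, 0.0, 0.0, 0.0)
pose = robot.forward_kinematics_v2(joints)       # pose reached by this configuration
joints_back = robot.inverse_kinematics_v2(pose)  # joints that reach the pose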
- get_pose_saved(pose_name)
Get a pose saved in Ned's memory
- Parameters:
pose_name (str) – Pose name in robot's memory
- Returns:
Pose associated to pose_name
- Return type:
PoseObject
- save_pose(pose_name, *args)
Save pose in robot's memory
- Parameters:
args (Union[list[float], tuple[float], PoseObject]) – either 6 args (one for each coordinate), or a list of 6 coordinates, or a PoseObject
- Return type:
None
- delete_pose(pose_name)
Delete pose from robot's memory
- Return type:
None
- get_saved_pose_list()
Get the list of pose names saved in the robot's memory
- Return type:
list[str]
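Example (a sketch of the pose memory workflow; the pose name is a placeholder):
robot.save_pose("observation_pose", robot.get_pose())  # store the current pose under a name
saved = robot.get_pose_saved("observation_pose")
robot.move(saved)
print(robot.get_saved_pose_list())
robot.delete_pose("observation_pose")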
- pick_from_pose(*args)
Deprecated since version 1.2.0: You should use pick() with a PoseObject object.
Execute a picking from a pose.
A picking is described as:
* going over the object
* going down until height = z
* grasping with tool
* going back over the object
- Parameters:
args (Union[list[float], tuple[float], PoseObject]) – either 6 args (one for each coordinate), or a list of 6 coordinates, or a PoseObject
- Return type:
None
- place_from_pose(*args)
Deprecated since version 1.2.0: You should use place() with a PoseObject object.
Execute a placing from a position.
A placing is described as:
* going over the place
* going down until height = z
* releasing the object with tool
* going back over the place
- Parameters:
args (Union[list[float], tuple[float], PoseObject]) – either 6 args (one for each coordinate), or a list of 6 coordinates, or a PoseObject
- Return type:
None
- pick(pick_position)
Execute a picking from a position.
A picking is described as:
* going over the object
* going down until height = z
* grasping with tool
* going back over the object
- Parameters:
pick_position (Union[JointsPosition, PoseObject]) – a pick position, can be either a Pose or a JointsPosition object
- Return type:
None
- place(place_position)
Execute a placing from a position.
A placing is described as:
* going over the place
* going down until height = z
* releasing the object with tool
* going back over the place
- Parameters:
place_position (Union[JointsPosition, PoseObject]) – a place position, can be either a Pose or a JointsPosition object
- Return type:
None
- pick_and_place(pick_pose, place_pos, dist_smoothing=0.0)
Execute a pick then a place
- Parameters:
pick_pose (Union[list[float], PoseObject]) – Pick Pose: [x, y, z, roll, pitch, yaw] or PoseObject
place_pos (Union[list[float], PoseObject]) – Place Pose: [x, y, z, roll, pitch, yaw] or PoseObject
dist_smoothing (float) – Distance from waypoints before smoothing trajectory
- Return type:
None
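Example (a pick_and_place sketch; the coordinates are illustrative, not validated poses):
pick_pose = PoseObject(0.25, 0.0, 0.14, 0.0, 1.57, 0.0)
place_pose = PoseObject(0.0, 0.25, 0.14, 0.0, 1.57, 0.0)
robot.pick_and_place(pick_pose, place_pose, dist_smoothing=0.01)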
- get_trajectory_saved(trajectory_name)
Get trajectory saved in Ned's memory
- Returns:
Trajectory
- Return type:
list[Joints]
- get_saved_trajectory_list()
Get the list of trajectory names saved in the robot's memory
- Return type:
list[str]
- execute_registered_trajectory(trajectory_name)
Execute trajectory from Ned's memory
- Return type:
None
- execute_trajectory(robot_positions, dist_smoothing=0.0)ο
Execute trajectory from list of poses and / or joints
- Parameters:
robot_positions (list[Union[JointsPosition, PoseObject]]) β List of poses or joints
dist_smoothing (float) β Distance from waypoints before smoothing trajectory
- Return type:
None
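Example (an execute_trajectory sketch mixing joints and pose waypoints; the values are illustrative):
waypoints = [
    JointsPosition(0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
    PoseObject(0.25, 0.0, 0.25, 0.0, 1.57, 0.0),
    JointsPosition(0.2, 0.1, 0.3, 0.0, 0.5, 0.0),
]
robot.execute_trajectory(waypoints, dist_smoothing=0.01)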
- execute_trajectory_from_poses(list_poses, dist_smoothing=0.0)
Deprecated since version 1.2.0: You should use execute_trajectory() with PoseObject objects.
Execute trajectory from list of poses
- Parameters:
list_poses (list[list[float]]) – List of [x,y,z,qx,qy,qz,qw] or list of [x,y,z,roll,pitch,yaw]
dist_smoothing (float) β Distance from waypoints before smoothing trajectory
- Return type:
None
- execute_trajectory_from_poses_and_joints(list_pose_joints, list_type=None, dist_smoothing=0.0)
Deprecated since version 1.2.0: You should use execute_trajectory() with PoseObject and JointsPosition objects.
Execute trajectory from list of poses and joints
Example:
robot.execute_trajectory_from_poses_and_joints(
    list_pose_joints=[[0.0, 0.0, 0.0, 0.0, 0.0, 0.0], [0.25, 0.0, 0.0, 0.0, 0.0, 0.0]],
    list_type=['joint', 'pose'],
    dist_smoothing=0.01
)
- Parameters:
list_pose_joints (list[list[float]]) – List of [x,y,z,qx,qy,qz,qw] or list of [x,y,z,roll,pitch,yaw] or a list of [j1,j2,j3,j4,j5,j6]
list_type (list[string]) – List of string 'pose' or 'joint', or ['pose'] (if poses only) or ['joint'] (if joints only). If None, it is assumed there are only poses in the list.
dist_smoothing (float) β Distance from waypoints before smoothing trajectory
- Return type:
None
- save_trajectory(trajectory, trajectory_name, trajectory_description)ο
Save trajectory in robot memory
- Parameters:
trajectory (list[list[float] | JointsPosition | PoseObject]) β list of Joints [j1, j2, j3, j4, j5, j6] as waypoints to create the trajectory
trajectory_name (str) β Name you want to give to the trajectory
trajectory_description β Description you want to give to the trajectory
- Return type:
None
- save_last_learned_trajectory(name, description)ο
Save last user executed trajectory
- Return type:
None
- update_trajectory_infos(name, new_name, new_description)
Update trajectory infos
- Parameters:
name (str) β current name of the trajectory you want to update infos
new_name (str) β new name you want to give the trajectory
new_description (str) β new description you want to give the trajectory
- Return type:
None
- delete_trajectory(trajectory_name)
Delete trajectory from robot's memory
- Return type:
None
- clean_trajectory_memory()
Delete all trajectories from robot's memory
- Return type:
None
- property toolο
- update_tool()ο
Update equipped tool
- Return type:
None
- grasp_with_tool()
Grasp with tool. This action corresponds to:
- Close gripper for Grippers
- Pull Air for Vacuum pump
- Activate for Electromagnet
- Return type:
None
- release_with_tool()
Release with tool. This action corresponds to:
- Open gripper for Grippers
- Push Air for Vacuum pump
- Deactivate for Electromagnet
- Return type:
None
- open_gripper(speed=500, max_torque_percentage=100, hold_torque_percentage=30)
Open gripper
- Parameters:
speed (int) – Between 100 & 1000 (only for Niryo One and Ned1)
max_torque_percentage (int) – Opening torque percentage (only for Ned2)
hold_torque_percentage (int) – Hold torque percentage after opening (only for Ned2)
- Return type:
None
- close_gripper(speed=500, max_torque_percentage=100, hold_torque_percentage=20)
Close gripper
- Parameters:
speed (int) – Between 100 & 1000 (only for Niryo One and Ned1)
max_torque_percentage (int) – Closing torque percentage (only for Ned2)
hold_torque_percentage (int) – Hold torque percentage after closing (only for Ned2)
- Return type:
None
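Example (a gripper sketch; assumes a gripper is equipped and that update_tool() is called after changing tools):
robot.update_tool()    # detect the equipped tool
robot.close_gripper(speed=500, max_torque_percentage=100, hold_torque_percentage=20)
robot.open_gripper(speed=500)
# tool-agnostic alternative (gripper, vacuum pump or electromagnet):
robot.grasp_with_tool()
robot.release_with_tool()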
- pull_air_vacuum_pump()ο
Pull air of vacuum pump
- Return type:
None
- push_air_vacuum_pump()ο
Push air of vacuum pump
- Return type:
None
- setup_electromagnet(pin_id)ο
Setup electromagnet on pin
- Parameters:
pin_id (PinID or str)
- Return type:
None
- activate_electromagnet(pin_id)ο
Activate electromagnet associated to electromagnet_id on pin_id
- Parameters:
pin_id (PinID or str)
- Return type:
None
- deactivate_electromagnet(pin_id)ο
Deactivate electromagnet associated to electromagnet_id on pin_id
- Parameters:
pin_id (PinID or str)
- Return type:
None
- enable_tcp(enable=True)ο
Enables or disables the TCP function (Tool Center Point). If activation is requested, the last recorded TCP value will be applied. The default value depends on the gripper equipped. If deactivation is requested, the TCP will be coincident with the tool_link.
- Parameters:
enable (bool) – True to enable, False otherwise.
- Return type:
None
- set_tcp(*args)ο
Activates the TCP function (Tool Center Point) and defines the transformation between the tool_link frame and the TCP frame.
- Parameters:
args (Union[list[float], tuple[float], PoseObject]) – either 6 args (one for each coordinate), or a list of 6 coordinates, or a PoseObject
- Return type:
None
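Example (a TCP sketch; the 5 cm offset along the tool Z axis is an illustrative value):
robot.set_tcp(0.0, 0.0, 0.05, 0.0, 0.0, 0.0)  # shift the TCP 5 cm along the tool Z axis
# ... subsequent poses now refer to the new TCP ...
robot.reset_tcp()                             # back to the default TCP of the equipped tool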
- reset_tcp()ο
Reset the TCP (Tool Center Point) transformation. The TCP will be reset according to the tool equipped.
- Return type:
None
- tool_reboot()ο
Reboot the motor of the tool equipped. Useful when an Overload error occurs. (cf HardwareStatus)
- Return type:
None
- get_tcp()ο
- set_pin_mode(pin_id, pin_mode)ο
Set pin number pin_id to mode pin_mode
- property digital_io_stateο
- get_digital_io_state()ο
Get Digital IO state : Names, modes, states.
Example:
digital_io_state = robot.digital_io_state
digital_io_state = robot.get_digital_io_state()
- Returns:
List of DigitalPinObject instances
- Return type:
list[DigitalPinObject]
- digital_write(pin_id, digital_state)ο
Set pin_id state to digital_state
- digital_read(pin_id)ο
Read pin number pin_id and return its state
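Example (a digital IO sketch; DO1 and DI1 are PinID members listed below and the pin choice is illustrative):
from pyniryo import PinID, PinState
robot.digital_write(PinID.DO1, PinState.HIGH)  # switch digital output DO1 on
state = robot.digital_read(PinID.DI1)          # read digital input DI1
print(robot.get_digital_io_state())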
- property analog_io_stateο
- get_analog_io_state()ο
Get Analog IO state : Names, modes, states
Example:
analog_io_state = robot.analog_io_state
analog_io_state = robot.get_analog_io_state()
- Returns:
List of AnalogPinObject instances
- Return type:
list[AnalogPinObject]
- analog_write(pin_id, value)
Set an analog pin_id state to a value
- Parameters:
pin_id (PinID or str)
value (float) – voltage between 0 and 5V
- Return type:
None
- analog_read(pin_id)ο
Read the analog pin value
- Parameters:
pin_id (PinID or str)
- Return type:
float
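Example (an analog IO sketch; AO1/AI1 are PinID members listed below and the voltage is illustrative):
from pyniryo import PinID
robot.analog_write(PinID.AO1, 2.5)    # drive analog output AO1 to 2.5 V
value = robot.analog_read(PinID.AI1)  # read the voltage on analog input AI1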
- property custom_button_stateο
- get_custom_button_state()ο
Get the Ned2βs custom button state
- Returns:
True if pressed, False otherwise
- Return type:
bool
- property hardware_statusο
- get_hardware_status()
Get hardware status: temperature, hardware version, motor names & types, etc.
- Returns:
Info contained in a HardwareStatusObject
- Return type:
HardwareStatusObject
- set_conveyor()
Activate a new conveyor and return its ID
- Returns:
New conveyor ID
- Return type:
ConveyorID
- unset_conveyor(conveyor_id)ο
Remove specific conveyor.
- Parameters:
conveyor_id (ConveyorID) β Basically, ConveyorID.ONE or ConveyorID.TWO
- run_conveyor(conveyor_id, speed=50, direction=ConveyorDirection.FORWARD)
Run conveyor at id 'conveyor_id'
- Parameters:
conveyor_id (ConveyorID)
speed (int)
direction (ConveyorDirection)
- Return type:
None
- stop_conveyor(conveyor_id)
Stop conveyor at id 'conveyor_id'
- Parameters:
conveyor_id (ConveyorID)
- Return type:
None
- control_conveyor(conveyor_id, control_on, speed, direction)
Control conveyor at id 'conveyor_id'
- Parameters:
conveyor_id (ConveyorID)
control_on (bool)
speed (int) β New speed which is a percentage of maximum speed
direction (ConveyorDirection) β Conveyor direction
- Return type:
None
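Example (a conveyor workflow sketch; assumes a conveyor is physically connected):
from pyniryo import ConveyorDirection
conveyor_id = robot.set_conveyor()  # activate the conveyor and get its ID
robot.run_conveyor(conveyor_id, speed=50, direction=ConveyorDirection.FORWARD)
robot.wait(3)
robot.stop_conveyor(conveyor_id)
robot.unset_conveyor(conveyor_id)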
- get_connected_conveyors_id()
- Returns:
List of the connected conveyors' IDs
- Return type:
list[ConveyorID]
- get_conveyors_feedback()
Get the feedback of the conveyors: conveyor id, direction, connection state, running, speed
- Returns:
List of the conveyors' feedback
- Return type:
list[ConveyorFeedback]
- get_img_compressed()
Get image from video stream in a compressed format. Use uncompress_image from the vision package to uncompress it
- Returns:
string containing a JPEG compressed image
- Return type:
str
- set_brightness(brightness_factor)ο
Modify video stream brightness
- Parameters:
brightness_factor (float) β How much to adjust the brightness. 0.5 will give a darkened image, 1 will give the original image while 2 will enhance the brightness by a factor of 2.
- Return type:
None
- set_contrast(contrast_factor)ο
Modify video stream contrast
- Parameters:
contrast_factor (float) – A factor of 1 gives the original image. Moving the factor towards 0 makes the image greyer, while a factor greater than 1 increases the contrast.
- Return type:
None
- set_saturation(saturation_factor)ο
Modify video stream saturation
- Parameters:
saturation_factor (float) β How much to adjust the saturation. 0 will give a black and white image, 1 will give the original image while 2 will enhance the saturation by a factor of 2.
- Return type:
None
- get_image_parameters()ο
Get last stream image parameters: Brightness factor, Contrast factor, Saturation factor.
Brightness factor: How much to adjust the brightness. 0.5 will give a darkened image, 1 will give the original image while 2 will enhance the brightness by a factor of 2.
Contrast factor: A factor of 1 gives original image. Making the factor towards 0 makes the image greyer, while factor>1 increases the contrast of the image.
Saturation factor: 0 will give a black and white image, 1 will give the original image while 2 will enhance the saturation by a factor of 2.
- Returns:
Brightness factor, Contrast factor, Saturation factor
- Return type:
float, float, float
- get_target_pose_from_rel(workspace_name, height_offset, x_rel, y_rel, yaw_rel)ο
Given a pose (x_rel, y_rel, yaw_rel) relative to a workspace, this function returns the robot pose in which the current tool will be able to pick an object at this pose.
The height_offset argument (in m) defines how high the tool will hover over the workspace. If height_offset = 0, the tool will nearly touch the workspace.
- Parameters:
workspace_name (str) β name of the workspace
height_offset (float) β offset between the workspace and the target height
x_rel (float) β x relative pose (between 0 and 1)
y_rel (float) β y relative pose (between 0 and 1)
yaw_rel (float) β Angle in radians
- Returns:
target_pose
- Return type:
PoseObject
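Example (a sketch of picking at the centre of a workspace; the workspace name is a placeholder):
pose = robot.get_target_pose_from_rel("my_workspace", height_offset=0.0,
                                      x_rel=0.5, y_rel=0.5, yaw_rel=0.0)
robot.move(pose)
robot.grasp_with_tool()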
- get_target_pose_from_cam(workspace_name, height_offset=0.0, shape=ObjectShape.ANY, color=ObjectColor.ANY)ο
First detects the specified object using the camera and then returns the robot pose in which the object can be picked with the current tool
- Parameters:
workspace_name (str) β name of the workspace
height_offset (float) β offset between the workspace and the target height
shape (ObjectShape) β shape of the target
color (ObjectColor) β color of the target
- Returns:
object_found, object_pose, object_shape, object_color
- Return type:
(bool, PoseObject, ObjectShape, ObjectColor)
- vision_pick(workspace_name, height_offset=0.0, shape=ObjectShape.ANY, color=ObjectColor.ANY, obs_pose=None)ο
Picks the specified object from the workspace. This function has multiple phases:
1. detect object using the camera
2. prepare the current tool for picking
3. approach the object
4. move down to the correct picking pose
5. actuate the current tool
6. lift the object
Example:
robot = NiryoRobot(ip_address="x.x.x.x")
robot.calibrate_auto()
robot.move_pose(<observation_pose>)
obj_found, shape_ret, color_ret = robot.vision_pick(<workspace_name>,
                                                    height_offset=0.0,
                                                    shape=ObjectShape.ANY,
                                                    color=ObjectColor.ANY)
- Parameters:
workspace_name (str) β name of the workspace
height_offset (float) β offset between the workspace and the target height
shape (ObjectShape) β shape of the target
color (ObjectColor) β color of the target
obs_pose (PoseObject) β An optional observation pose
- Returns:
object_found, object_shape, object_color
- Return type:
(bool, ObjectShape, ObjectColor)
- move_to_object(workspace_name, height_offset, shape, color)ο
Same as get_target_pose_from_cam but directly moves to this position
- Parameters:
workspace_name (str) β name of the workspace
height_offset (float) β offset between the workspace and the target height
shape (ObjectShape) β shape of the target
color (ObjectColor) β color of the target
- Returns:
object_found, object_shape, object_color
- Return type:
(bool, ObjectShape, ObjectColor)
- detect_object(workspace_name, shape=ObjectShape.ANY, color=ObjectColor.ANY)ο
Detect object in workspace and return its pose and characteristics
- Parameters:
workspace_name (str) β name of the workspace
shape (ObjectShape) β shape of the target
color (ObjectColor) β color of the target
- Returns:
object_found, object_rel_pose, object_shape, object_color
- Return type:
(bool, list, str, str)
- get_camera_intrinsics()ο
Get calibration object: camera intrinsics, distortions coefficients
- Returns:
camera intrinsics, distortions coefficients
- Return type:
(list[list[float]], list[list[float]])
- save_workspace_from_robot_poses(workspace_name, pose_origin, pose_2, pose_3, pose_4)
Save workspace by giving the poses of the robot to point its 4 corners with the calibration tip. Corners should be in the right order. Markers' poses will be deduced from these poses.
Poses should be either a list [x, y, z, roll, pitch, yaw] or a PoseObject
- Parameters:
workspace_name (str) β workspace name, maximum length 30 char.
pose_origin (Union[list[float], PoseObject])
pose_2 (Union[list[float], PoseObject])
pose_3 (Union[list[float], PoseObject])
pose_4 (Union[list[float], PoseObject])
- Return type:
None
- save_workspace_from_points(workspace_name, point_origin, point_2, point_3, point_4)
Save workspace by giving the points of the workspace's 4 corners. Points are written as [x, y, z]. Corners should be in the right order.
- Parameters:
workspace_name (str) β workspace name, maximum length 30 char.
point_origin (list[float])
point_2 (list[float])
point_3 (list[float])
point_4 (list[float])
- Return type:
None
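Example (a sketch of defining a workspace from points; the name and corner coordinates are illustrative):
robot.save_workspace_from_points(
    "my_workspace",
    [0.34, 0.09, 0.0],   # origin corner
    [0.34, -0.09, 0.0],
    [0.16, -0.09, 0.0],
    [0.16, 0.09, 0.0],
)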
- delete_workspace(workspace_name)
Delete workspace from robot's memory
- Parameters:
workspace_name (str)
- Return type:
None
- get_workspace_poses(workspace_name)ο
- get_workspace_ratio(workspace_name)
Get workspace ratio from robot's memory
- Parameters:
workspace_name (str)
- Return type:
float
- get_workspace_list()
Get the list of workspace names stored in the robot's memory
- Return type:
list[str]
- get_saved_dynamic_frame_list()ο
Get list of saved dynamic frames
Example:
list_frame, list_desc = robot.get_saved_dynamic_frame_list()
print(list_frame)
print(list_desc)
- Returns:
list of dynamic frame names, list of dynamic frame descriptions
- Return type:
list[str], list[str]
- get_saved_dynamic_frame(frame_name)ο
Get name, description and pose of a dynamic frame
Example:
frame = robot.get_saved_dynamic_frame("default_frame")
- Parameters:
frame_name (str) β name of the frame
- Returns:
name, description, position and orientation of a frame
- Return type:
list[str, str, list[float]]
- save_dynamic_frame_from_poses(frame_name, description, pose_origin, pose_x, pose_y, belong_to_workspace=False)ο
Create a dynamic frame with 3 poses (origin, x, y)
Example:
pose_o = [0.1, 0.1, 0.1, 0, 0, 0]
pose_x = [0.2, 0.1, 0.1, 0, 0, 0]
pose_y = [0.1, 0.2, 0.1, 0, 0, 0]
robot.save_dynamic_frame_from_poses("name", "a test description", pose_o, pose_x, pose_y)
- Parameters:
frame_name (str) β name of the frame
description (str) β description of the frame
pose_origin (list[float] [x, y, z, roll, pitch, yaw]) – pose of the origin of the frame
pose_x (list[float] [x, y, z, roll, pitch, yaw]) – pose of the point x of the frame
pose_y (list[float] [x, y, z, roll, pitch, yaw]) – pose of the point y of the frame
belong_to_workspace (bool) – indicate whether the frame belongs to a workspace
- Returns:
None
- save_dynamic_frame_from_points(frame_name, description, point_origin, point_x, point_y, belong_to_workspace=False)ο
Create a dynamic frame with 3 points (origin, x, y)
Example:
point_o = [-0.1, -0.1, 0.1]
point_x = [-0.2, -0.1, 0.1]
point_y = [-0.1, -0.2, 0.1]
robot.save_dynamic_frame_from_points("name", "a test description", point_o, point_x, point_y)
- Parameters:
frame_name (str) β name of the frame
description (str) β description of the frame
point_origin (list[float] [x, y, z]) β origin point of the frame
point_x (list[float] [x, y, z]) β point x of the frame
point_y (list[float] [x, y, z]) β point y of the frame
belong_to_workspace (bool) – indicate whether the frame belongs to a workspace
- Returns:
None
- edit_dynamic_frame(frame_name, new_frame_name, new_description)ο
Modify a dynamic frame
Example:
robot.edit_dynamic_frame("name", "new_name", "new description")
- Parameters:
frame_name (str) β name of the frame
new_frame_name (str) β new name of the frame
new_description (str) β new description of the frame
- Returns:
None
- delete_dynamic_frame(frame_name, belong_to_workspace=False)ο
Delete a dynamic frame
Example:
robot.delete_dynamic_frame("name")
- Parameters:
frame_name (str) β name of the frame to remove
belong_to_workspace (bool) – indicate whether the frame belongs to a workspace
- Returns:
None
- move_relative(offset, frame='world')
Deprecated since version 1.2.0: You should use move() with a frame in the pose metadata.
Move robot end effector by an offset in a frame
Example:
robot.move_relative([0.05, 0.05, 0.05, 0.3, 0, 0], frame="default_frame")
- Parameters:
offset (list[float]) β list which contains offset of x, y, z, roll, pitch, yaw
frame (str) β name of local frame
- Returns:
None
- move_linear_relative(offset, frame='world')
Deprecated since version 1.2.0: You should use move() with a frame in the pose metadata and linear=True.
Move robot end effector by an offset with a linear movement in a frame
Example:
robot.move_linear_relative([0.05, 0.05, 0.05, 0.3, 0, 0], frame="default_frame")
- Parameters:
offset (list[float]) β list which contains offset of x, y, z, roll, pitch, yaw
frame (str) β name of local frame
- Returns:
None
- get_sounds()ο
Get sound name list
- Returns:
list of the sounds of the robot
- Return type:
list[string]
- play_sound(sound_name, wait_end=True, start_time_sec=0, end_time_sec=0)ο
Play a sound from the robot
- Parameters:
sound_name (string) β Name of the sound to play
wait_end (bool) β wait for the end of the sound before exiting the function
start_time_sec (float) β start the sound from this value in seconds
end_time_sec (float) β end the sound at this value in seconds
- Return type:
None
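Example (a sound sketch; assumes at least one sound is stored on the robot):
sounds = robot.get_sounds()
if sounds:
    robot.set_volume(60)
    robot.play_sound(sounds[0], wait_end=True)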
- set_volume(sound_volume)ο
Set the volume percentage of the robot.
- Parameters:
sound_volume (int) β volume percentage of the sound (0: no sound, 100: max sound)
- Return type:
None
- stop_sound()ο
Stop a sound being played.
- Return type:
None
- get_sound_duration(sound_name)ο
Returns the duration in seconds of a sound stored in the robot database. Raises SoundRosWrapperException if the sound doesn't exist
- Parameters:
sound_name (string) β name of sound
- Returns:
sound duration in seconds
- Return type:
float
- say(text, language=0)
Use gtts (Google Text To Speech) to interpret a string as sound. Languages available are:
* English: 0
* French: 1
* Spanish: 2
* Mandarin: 3
* Portuguese: 4
Example
robot.say("Hello", 0)
robot.say("Bonjour", 1)
robot.say("Hola", 2)
- Parameters:
text (string) β Text that needs to be spoken < 100 char
language (int) β language of the text
- Return type:
None
- set_led_color(led_id, color)ο
Lights up an LED in one colour. RGB colour between 0 and 255.
Example:
robot.set_led_color(5, [15, 50, 255])
- Parameters:
led_id (int) β Id of the led: between 0 and 29
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
- led_ring_solid(color)ο
Set the whole Led Ring to a fixed color.
Example:
robot.led_ring_solid([15, 50, 255])
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
- led_ring_turn_off()ο
Turn off all LEDs
Example:
robot.led_ring_turn_off()
- led_ring_flashing(color, period=0, iterations=0, wait=False)ο
Flashes a color according to a frequency. The frequency is equal to 1 / period.
Examples:
robot.led_ring_flashing([15, 50, 255])
robot.led_ring_flashing([15, 50, 255], 1, 100, True)
robot.led_ring_flashing([15, 50, 255], iterations=20, wait=True)
frequency = 20  # Hz
total_duration = 10  # seconds
robot.led_ring_flashing([15, 50, 255], 1. / frequency, total_duration * frequency, True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive flashes. If 0, the Led Ring flashes endlessly.
wait (bool) β The service wait for the animation to finish all iterations or not to answer. If iterations is 0, the service answers immediately.
- led_ring_alternate(color_list, period=0, iterations=0, wait=False)ο
Several colors are alternated one after the other.
Examples:
color_list = [
    [15, 50, 255],
    [255, 0, 0],
    [0, 255, 0],
]
robot.led_ring_alternate(color_list)
robot.led_ring_alternate(color_list, 1, 100, True)
robot.led_ring_alternate(color_list, iterations=20, wait=True)
- Parameters:
color_list (list[list[float]]) β Led color list of lists of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive alternations. If 0, the Led Ring alternates endlessly.
wait (bool) β The service wait for the animation to finish all iterations or not to answer. If iterations is 0, the service answers immediately.
- led_ring_chase(color, period=0, iterations=0, wait=False)ο
Movie theater light style chaser animation.
Examples:
robot.led_ring_chase([15, 50, 255])
robot.led_ring_chase([15, 50, 255], 1, 100, True)
robot.led_ring_chase([15, 50, 255], iterations=20, wait=True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive chase. If 0, the animation continues endlessly. One chase just lights one Led every 3 LEDs.
wait (bool) β The service wait for the animation to finish all iterations or not to answer. If iterations is 0, the service answers immediately.
- led_ring_wipe(color, period=0, wait=False)ο
Wipe a color across the Led Ring, light a Led at a time.
Examples:
robot.led_ring_wipe([15, 50, 255])
robot.led_ring_wipe([15, 50, 255], 1, True)
robot.led_ring_wipe([15, 50, 255], wait=True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
wait (bool) β The service wait for the animation to finish or not to answer.
- led_ring_rainbow(period=0, iterations=0, wait=False)ο
Draw rainbow that fades across all LEDs at once.
Examples:
robot.led_ring_rainbow()
robot.led_ring_rainbow(5, 2, True)
robot.led_ring_rainbow(wait=True)
- Parameters:
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive rainbows. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_rainbow_cycle(period=0, iterations=0, wait=False)ο
Draw rainbow that uniformly distributes itself across all LEDs.
Examples:
robot.led_ring_rainbow_cycle()
robot.led_ring_rainbow_cycle(5, 2, True)
robot.led_ring_rainbow_cycle(wait=True)
- Parameters:
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive rainbow cycles. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_rainbow_chase(period=0, iterations=0, wait=False)ο
Rainbow chase animation, like the led_ring_chase method.
Examples:
robot.led_ring_rainbow_chase()
robot.led_ring_rainbow_chase(5, 2, True)
robot.led_ring_rainbow_chase(wait=True)
- Parameters:
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive rainbow cycles. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_go_up(color, period=0, iterations=0, wait=False)ο
LEDs turn on like a loading circle, and are then all turned off at once.
Examples:
robot.led_ring_go_up([15, 50, 255])
robot.led_ring_go_up([15, 50, 255], 1, 100, True)
robot.led_ring_go_up([15, 50, 255], iterations=20, wait=True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive turns around the Led Ring. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_go_up_down(color, period=0, iterations=0, wait=False)ο
LEDs turn on like a loading circle, and are turned off the same way.
Examples:
robot.led_ring_go_up_down([15, 50, 255])
robot.led_ring_go_up_down([15, 50, 255], 1, 100, True)
robot.led_ring_go_up_down([15, 50, 255], iterations=20, wait=True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive turns around the Led Ring. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_breath(color, period=0, iterations=0, wait=False)ο
Variation of the light intensity of the LED ring, similar to human breathing.
Examples:
robot.led_ring_breath([15, 50, 255])
robot.led_ring_breath([15, 50, 255], 1, 100, True)
robot.led_ring_breath([15, 50, 255], iterations=20, wait=True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default time will be used.
iterations (int) β Number of consecutive turns around the Led Ring. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_snake(color, period=0, iterations=0, wait=False)ο
A small coloured snake (certainly a python :D ) runs around the LED ring.
Examples:
robot.led_ring_snake([15, 50, 255])
robot.led_ring_snake([15, 50, 255], 1, 100, True)
- Parameters:
color (list[float]) β Led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
period (float) β Execution time for a pattern in seconds. If 0, the default duration will be used.
iterations (int) β Number of consecutive turns around the Led Ring. If 0, the animation continues endlessly.
wait (bool) β The service wait for the animation to finish or not to answer. If iterations is 0, the service answers immediately.
- led_ring_custom(led_colors)ο
Sends a colour command to all LEDs of the LED ring. The function expects a list of colours for the 30 LEDs of the robot.
Example:
led_list = [[i / 30. * 255, 0, 255 - i / 30.] for i in range(30)]
robot.led_ring_custom(led_list)
- Parameters:
led_colors (list[list[float]]) β List of size 30 of led color in a list of size 3[R, G, B]. RGB channels from 0 to 255.
Exceptions
Exception classes for TCP
- exception pyniryo.api.exceptions.ClientNotConnectedException(msg=None)
- exception pyniryo.api.exceptions.HostNotReachableException
- exception pyniryo.api.exceptions.InvalidAnswerException(answer)
- exception pyniryo.api.exceptions.NiryoRobotException
- exception pyniryo.api.exceptions.TcpCommandException
Enums
- class pyniryo.api.enums_communication.CalibrateMode(*values)ο
Enumeration of Calibration Modes
- AUTO = 0ο
- MANUAL = 1ο
- class pyniryo.api.enums_communication.ConveyorDirection(*values)ο
Enumeration of Conveyor Directions used for Conveyor control
- BACKWARD = -1ο
- FORWARD = 1ο
- class pyniryo.api.enums_communication.ConveyorID(*values)ο
Enumeration of Conveyor IDs used for Conveyor control
- ID_1 = -1ο
- ID_2 = -2ο
- NONE = 0ο
- class pyniryo.api.enums_communication.ObjectColor(*values)ο
Enumeration of Colors available for image processing
- ANY = 'ANY'ο
- BLUE = 'BLUE'ο
- GREEN = 'GREEN'ο
- RED = 'RED'ο
- class pyniryo.api.enums_communication.ObjectShape(*values)ο
Enumeration of Shapes available for image processing
- ANY = 'ANY'ο
- CIRCLE = 'CIRCLE'ο
- SQUARE = 'SQUARE'ο
- class pyniryo.api.enums_communication.PinID(*values)ο
Enumeration of Robot Pins
- AI1 = 'AI1'ο
- AI2 = 'AI2'ο
- AO1 = 'AO1'ο
- AO2 = 'AO2'ο
- DI1 = 'DI1'ο
- DI2 = 'DI2'ο
- DI3 = 'DI3'ο
- DI4 = 'DI4'ο
- DI5 = 'DI5'ο
- DO1 = 'DO1'ο
- DO2 = 'DO2'ο
- DO3 = 'DO3'ο
- DO4 = 'DO4'ο
- GPIO_1A = '1A'ο
- GPIO_1B = '1B'ο
- GPIO_1C = '1C'ο
- GPIO_2A = '2A'ο
- GPIO_2B = '2B'ο
- GPIO_2C = '2C'ο
- SW_1 = 'SW1'ο
- SW_2 = 'SW2'ο
- class pyniryo.api.enums_communication.PinMode(*values)ο
Enumeration of Pin Modes
- INPUT = 1ο
- OUTPUT = 0ο
- class pyniryo.api.enums_communication.PinState(*values)ο
Pin State is either LOW or HIGH
- HIGH = Trueο
- LOW = Falseο
Python object classes
- class pyniryo.api.objects.AnalogPinObject(pin_id, name, mode, value)
Object used to store information on analog pins
- class pyniryo.api.objects.DigitalPinObject(pin_id, name, mode, state)
Object used to store information on digital pins
- class pyniryo.api.objects.HardwareStatusObject(rpi_temperature, hardware_version, connection_up, error_message, calibration_needed, calibration_in_progress, motor_names, motor_types, motors_temperature, motors_voltage, hardware_errors)ο
Object used to store every hardware information
- class pyniryo.api.objects.JointsPosition(*joints, **kwargs)ο
Represents a robot position given by the position of each of its joints
- Parameters:
joints (float) β all the joints positions
- classmethod from_dict(d)ο
Creates a new JointsPosition object from a dictionary representing the object.
- Parameters:
d (dict) β A dictionary representing the object.
- to_dict()ο
- Returns:
A dictionary representing the object.
- Return type:
dict
- to_list()ο
- Returns:
A list containing all the joints positions.
- Return type:
list[float]
- class pyniryo.api.objects.PoseMetadata(version, frame='', length_unit=LengthUnit.METERS)ο
Represents all the metadata of a PoseObject.
- Variables:
version – The version of the metadata. Each new version adds more attributes. (use v1() or v2() to quickly create a default metadata instance)
frame – Name of the frame if the pose is relative to a frame other than the world.
length_unit – The length unit of the position (x, y, z). Default: LengthUnit.METERS
- classmethod from_dict(d)
Creates a new object from a dictionary representing the object.
- Parameters:
d (dict) – A dictionary representing the object.
- Returns:
A new PoseMetadata instance.
- Return type:
PoseMetadata
- to_dict()ο
- Returns:
A dictionary representing the object.
- Return type:
dict
- classmethod v1(frame='')
Quickly creates a new PoseMetadata instance initialized as a V1 pose.
- Parameters:
frame (str) – The frame of the pose to create.
- classmethod v2(frame='', length_unit=LengthUnit.METERS)
Quickly creates a new PoseMetadata instance initialized as a V2 pose.
- Parameters:
frame (str) – The frame of the pose to create. Default: ''
length_unit – The length unit of the position (x, y, z). Default: LengthUnit.METERS
- class pyniryo.api.objects.PoseObject(x, y, z, roll, pitch, yaw, metadata=<pyniryo.api.objects.PoseMetadata object>)ο
Pose object which stores x, y, z, roll, pitch & yaw parameters
- Parameters:
x (float) β The x coordinate of the pose.
y (float) β The y coordinate of the pose.
z (float) β The z coordinate of the pose.
roll (float) β The roll angle of the pose.
pitch (float) β The pitch angle of the pose.
yaw (float) β The yaw angle of the pose.
metadata (PoseMetadata) β The metadata of the pose.
- copy_with_offsets(x_offset=0.0, y_offset=0.0, z_offset=0.0, roll_offset=0.0, pitch_offset=0.0, yaw_offset=0.0)ο
Create a new pose from copying actual pose with offsets
- Parameters:
x_offset (float) β Offset of the x coordinate of the new pose.
y_offset (float) β Offset of the y coordinate of the new pose.
z_offset (float) β Offset of the z coordinate of the new pose.
roll_offset (float) β Offset of the roll angle of the new pose.
pitch_offset (float) β Offset of the pitch angle of the new pose.
yaw_offset (float) β Offset of the yaw angle of the new pose.
- Returns:
A new PoseObject with the applied offsets.
- Return type:
PoseObject
- classmethod from_dict(d)ο
Creates a new PoseObject from a dictionary representing the object.
- Parameters:
d (dict) β A dictionary representing the object.
- quaternion(normalization_tolerance=1e-05)ο
Convert the pose rotation (euler angles) to a quaternion.
- Parameters:
normalization_tolerance (float) β Tolerance for quaternion normalization.
- Returns:
A quaternion.
- Return type:
list
- to_dict()ο
- Returns:
A dictionary representing the object.
- Return type:
dict
- to_list()ο
- Returns:
A list [x, y, z, roll, pitch, yaw]
- Return type:
list[float]