
Is it possible to get TCP position if manipulator joint values are known? (Repost from Git Issues) #38

#1

qwasko1212 asks:

I need to convert robot joint rotations to Cartesian coordinates of the TCP.

Example:
joints: [val, val, val, val, val, val] → Cartesian coordinates: [x, y, z, roll, pitch, yaw]

The goal is to compare the real robot's position against the expected output; the real joint positions are read directly from the robot.
I have tried different methods, but none of them led to the proper result. Is it possible to perform this conversion using klampt in Python?

krishauser · Team Author
#2

Yes, it just takes a few lines of code.

from klampt import *
from klampt.math import so3

#setup
world = WorldModel()
world.loadFile("myrobot.urdf")
robot = world.robot(0)

#query: set the configuration, then read off the TCP pose
endEffectorLink = robot.link(robot.numLinks()-1)
robot.setConfig(joints)   #joints must have robot.numLinks() entries
x,y,z = endEffectorLink.getWorldPosition([toolx,tooly,toolz])  #tool point in the link's local frame
rotation = endEffectorLink.getTransform()[0]
roll,pitch,yaw = so3.rpy(rotation)
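As an aside on the last line: getTransform()[0] returns the link rotation as Klampt's so3 type (a 9-element list in column-major order), and so3.rpy extracts Euler angles from it. Below is a minimal sketch of that extraction in plain Python on a row-major 3×3 matrix, assuming the common ZYX convention R = Rz(yaw)·Ry(pitch)·Rx(roll); this is an illustration, not Klampt's own data layout, and it ignores the degenerate case pitch = ±90°.

```python
import math

def rpy_from_matrix(R):
    """Extract (roll, pitch, yaw) from a row-major 3x3 rotation matrix,
    assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll) (ZYX convention).
    Not valid at the gimbal-lock singularity pitch = +/- pi/2."""
    pitch = -math.asin(R[2][0])
    roll = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(R[1][0], R[0][0])
    return roll, pitch, yaw

# 90-degree rotation about z: roll = pitch = 0, yaw = pi/2
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
print(rpy_from_matrix(R))
```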
qwasko1212
#3

Thank you for your reply. I have a question: what is [toolx,tooly,toolz]?

krishauser · Team Author
#4

These would be the tool coordinates in the local frame of the end-effector link. (The default would be [0,0,0], i.e., the link origin.)
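To make the role of the tool point concrete, here is a small sketch (plain Python, no Klampt needed) of what getWorldPosition computes: the link pose is a rotation R and translation t, and a tool point given in the link's local frame maps to the world as R·p_local + t. The pose values below are illustrative, not from the thread.

```python
def tool_world_position(R, t, p_local):
    """Return R @ p_local + t for a row-major 3x3 rotation R and
    translation t, i.e., the world position of a point given in the
    link's local frame."""
    return [sum(R[i][j] * p_local[j] for j in range(3)) + t[i]
            for i in range(3)]

# Example pose: link rotated 90 degrees about z, located at (1, 0, 0.5)
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
t = [1.0, 0.0, 0.5]

print(tool_world_position(R, t, [0.0, 0.0, 0.0]))  # tool at the link origin: just t
print(tool_world_position(R, t, [0.1, 0.0, 0.0]))  # 10 cm along the link's local x axis
```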

qwasko1212:

I used the following code for forward kinematics (UR5 robot arm).

def forw(self,config):
    """
    :param config: (6,)-np.ndarray
    :return: R ((3,3)-ndarray), t ((3,)-ndarray)
    """
    #pad the 6 joint values out to the robot's full configuration length
    config = np.hstack((config, np.zeros(11))).tolist()
    link_ee = self.robot.link(self.objLinkNr)
    self.robot.setConfig(config)
    t = link_ee.getWorldPosition(link_ee.getLocalPosition([0,0,0]))
    R = link_ee.getTransform()[0]
    R = np.array(R).reshape(3, 3)
    assert(isRotationMatrix(R))
    t = np.array(t)
    return R, t

However, if I run p = forw(q1) and then q2 = ik_inv(p), the check assert q1[k] == pytest.approx(q2[k], rel=1e-3) fails. The inverse kinematics code:

def ik_inv(self,pose):
    """
    :param pose:  tuple ((3,3)-ndarray (rotation matrix), (3,)-ndarray (x,y,z) )
    :return: (6,)-ndarray
    """
    link = self.robot.link(self.objLinkNr)
    (R, t) = pose
    tList = t.tolist()
    R = [R[0,0], R[0,1], R[0,2], R[1,0], R[1,1], R[1,2], R[2,0],
         R[2,1], R[2,2]]
    obj = ik.objective(link, local=[1, 0, 0], world=tList, R=R)
    ik_solver = ik.solver(obj)
    numRestarts = 100
    solved = False
    res = ik_solver.solve()
    for i in range(numRestarts):
        ik_solver.sampleInitial()
        ik_solver.setMaxIters(100)
        ik_solver.setTolerance(1e-3)
        res = ik_solver.solve()
        if res:
            solved = True
            break
    q = np.array(self.robot.getConfig())[:6]
    return q

Values and the resulting assertion errors for several choices of q1:
q1 = np.array([0.47, -1.208, -1.20, 1.1, 1.2, 0.01]): assert 0.46950542612450474 == 0.47 ± 4.7e-04
q1 = np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.1]): assert -0.8841181841083773 == 0.1 ± 1.0e-04
q1 = np.array([0.7, 0.1, 1.5, 0.1, 0.1, 0.1]): assert 0.7812173624608203 == 0.7 ± 7.0e-04
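For context: a round-trip check like q1 ≈ ik_inv(forw(q1)) can fail even when the IK solver converges, because most end-effector poses are reachable from several distinct joint configurations, and the solver may return a different one than q1. A toy planar 2R arm (plain Python, not Klampt; link lengths and angles are purely illustrative) makes this concrete, with its elbow-down and elbow-up solutions:

```python
import math

def fk(q1, q2, l1=1.0, l2=1.0):
    """Planar 2R forward kinematics: joint angles -> end point (x, y)."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

def ik_both(x, y, l1=1.0, l2=1.0):
    """Return both solutions (elbow-down and elbow-up) reaching (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    sols = []
    for q2 in (math.acos(c2), -math.acos(c2)):
        q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                           l1 + l2 * math.cos(q2))
        sols.append((q1, q2))
    return sols

target = fk(0.3, 0.8)
for q in ik_both(*target):
    # both configurations reproduce the target to machine precision,
    # yet only one of them equals the original (0.3, 0.8)
    print(q, fk(*q))
```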