Posts by mahboobelahi93

    Where is joint_pos coming from, and why are you overwriting lpos if you are pulling that from a PR? Why pull lpos from a PR at all? This is probably where your uninit error is coming from.

Joint POS is coming from another application (App2) running in the cloud. App2 receives a picture of the robot's work cell, processes it with a CNN to detect the object to pick, applies geometric transformations, and provides the joint angles for the object that needs to be picked.

My middleware application (a Python script) receives it. The following is the message received from App2:

The joint angles in the message are dummy values (taught randomly for testing). J1 and J2 are reachable locations; J3, on the other hand, is not reachable.

If a position is not reachable, the middleware starts the camera cycle for a new picture and sends it to App2. The following picture shows the big picture of the use case. A lot of the steps for App1 (the middleware) have already been completed.
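Just to make it concrete, below is a minimal sketch of the App1 cycle as I picture it. Everything in it is a placeholder for my own code (capture_workcell_image, send_to_app2, is_reachable, send_joints_to_robot are made-up names, not FANUC or App2 APIs), and the reachability answer is assumed to come back from the robot side, e.g. from a KAREL check.

Code
# Minimal sketch of the App1 (middleware) cycle. All helpers are placeholders
# for my own code; the reachability check is assumed to be done on the robot.
from typing import List

def capture_workcell_image() -> bytes:
    """Placeholder: trigger the camera and return the work-cell picture."""
    return b""

def send_to_app2(image: bytes) -> List[float]:
    """Placeholder: App2 runs CNN detection and returns joint angles."""
    return [0.0] * 6

def is_reachable(joints: List[float]) -> bool:
    """Placeholder: ask the robot whether the joint target is reachable."""
    return True

def send_joints_to_robot(joints: List[float]) -> None:
    """Placeholder: hand the joint target to the robot (e.g. via a PR)."""

def run_pick_cycle() -> None:
    while True:
        image = capture_workcell_image()
        joints = send_to_app2(image)       # dummy/taught angles while testing
        if is_reachable(joints):           # e.g. J1/J2 above, but not J3
            send_joints_to_robot(joints)
            break                          # robot can execute the pick
        # not reachable: take a new picture and ask App2 again

if __name__ == "__main__":
    run_pick_cycle()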

Thanks, Nation, for your response.

Here is my full code; I am now getting an error at:

    Hey there,


Indeed, there are two built-in functions for this purpose:
    POS_TO_JNTP for cartesian to joint position
    JNTP_TO_POS for joint to cartesian position.


Check the manual chapter concerning built-in functions. There is a load of other useful functionality already implemented as well.

    Hi there,


I think these instructions are not for the R-30iB robot controller. Which instructions can be used for these conversions?

    Hi all,


I was looking at the KAREL Reference Manual, section B.8, where the author uses the "CHECK_EPOS" built-in with an "XYZWPR" data type. I am trying to do the same, but I am getting a warning from ktrans:


    113 CHECK_EPOS(lpos,UFRAME,UTOOL, 1,STATUS)

    ^ WARNING

    Argument will be passed by value. Id: LPOS


    Here is my code:

    Code
      GET_VAR(entry, '*SYSTEM*', '$MNUFRAME[1,1]', UFRAME, STATUS)
      GET_VAR(entry, '*SYSTEM*', '$UTOOL[1,1]', UTOOL, STATUS)
      lpos= GET_POS_REG(reg_ID_CART,STATUS,1)
      CHECK_EPOS(lpos,UFRAME,UTOOL, 1,STATUS)
      WRITE('CHECK_EPOS',lpos,STATUS,CR)

Also, when that KAREL program is run from the TP, I get an "uninitialized data" error at:

    Code
    CHECK_EPOS(lpos,UFRAME,UTOOL, 1,STATUS)

Any help will be appreciated.


    Thanks in advance

    Hi all,


    Thank you for providing such comprehensive information.

We have the following FANUC LR-Mate robot cell. I am involved in a camera-guided pick-and-place task with this robot, but for this task I need to bypass the robot's iRVision controller and use external pre-trained models for object detection.


Description of the work:

My work involves capturing pictures of the robot workspace and processing them with pre-trained models for basic object recognition (a CNN or other DL algorithm) to detect and localize objects. The position of an object in the image is then transformed, with simple geometric transformations, into the Cartesian space of the robot, and trajectories are issued to the robot to pick the object from the workspace.

So far I have been able to:

1. retrieve workspace images through the robot's FTP server (see the sketch right after this list)
2. command the robot from a Python script to move to a position
3. detect the objects in the workspace picture with a pre-trained CNN model
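For step 1, this is roughly what I am doing with Python's standard ftplib; the IP address and image file name are placeholders for my setup, not fixed FANUC values.

Code
# Sketch of fetching a work-cell image from the robot's FTP server with ftplib.
from ftplib import FTP

ROBOT_IP = "192.168.1.10"      # placeholder controller address
IMAGE_NAME = "workspace.jpg"   # placeholder image file name on the controller

def fetch_workspace_image(local_path: str = "workspace.jpg") -> str:
    ftp = FTP(ROBOT_IP)
    ftp.login()                # anonymous login; pass user/password if configured
    with open(local_path, "wb") as f:
        ftp.retrbinary("RETR " + IMAGE_NAME, f.write)
    ftp.quit()
    return local_path

if __name__ == "__main__":
    print("saved image to", fetch_workspace_image())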

To complete this task I need the following information (I searched on Google but could not understand it :frowning_face: ):

1. Which frame (UF9, or UF0, the world frame) should be used for transforming the position of the object in the image into the Cartesian space of the robot, so the robot can move to this point?

During my search I found that the following information is needed to complete my task:

1. The image frame origin with respect to the robot base frame. (I don't know what is meant by "image frame", but my understanding is that this frame should be UF9; I am not sure.)
2. The mm-per-pixel scale of the image. (This may be the magnification value. Is it constant for a fixed camera setup? A rough worked example of how I plan to use these two values is sketched below.)
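To show what I mean by these two values, here is the simple transformation I plan to use. It assumes a fixed overhead camera, a flat workspace, no lens distortion, and that the chosen user frame (UF9 in my case) has its X/Y axes aligned with the image axes; every number below is made up for illustration, and in practice a flipped or rotated image axis would add a sign change or a small rotation.

Code
# Worked example of pixel -> user-frame coordinates under strong assumptions:
# fixed overhead camera, flat workspace, no distortion, aligned axes.
MM_PER_PIXEL = 0.45                    # scale: constant for a fixed camera mounting (made up)
IMAGE_ORIGIN_IN_UF = (120.0, -35.0)    # X, Y of pixel (0, 0) in the user frame, mm (made up)
PICK_Z_IN_UF = -270.0                  # fixed pick height in the user frame, mm (made up)

def pixel_to_uf(px: float, py: float):
    """Convert a detected pixel position to (X, Y, Z) in the user frame, in mm."""
    x_mm = IMAGE_ORIGIN_IN_UF[0] + px * MM_PER_PIXEL
    y_mm = IMAGE_ORIGIN_IN_UF[1] + py * MM_PER_PIXEL
    return (x_mm, y_mm, PICK_Z_IN_UF)

# Example: the CNN reports an object centre at pixel (412, 250)
print(pixel_to_uf(412, 250))   # -> (305.4, 77.5, -270.0)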


Looking forward to your response.

    Thanks to all.

    Hello experts,


    Can someone explain the difference between

• the application frame, the offset frame, and the grid calibration frame in the context of the FANUC iRVision tool? I have an R-30iB controller, and its frame 9 is being used as the application frame, offset frame, and grid calibration frame in our vision application.
• Of the above-mentioned frames, which frame provides the image frame origin with respect to the robot base frame?


The first suggestion I have is to run Wireshark on the computer running the Python app and examine the raw packets.

    Hi SkyFire,


I modified my Python script like this:


JSON_DATA[data[0]] = dict([("XYZ", [float(i) for i in data[1:4]]),
                           ("WPR", [float(i) for i in data[4:7]]),
                           ("CONFIG", "NUT000")])

Now my JSON data is OK, but in the console there is still data overlap. I have checked the data packets in Wireshark, and the packet sizes vary randomly between about 70 and 90 bytes.
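For reference, here is the same snippet in a self-contained form, assuming each record is a whitespace-separated string of the form label, X Y Z, W P R, configuration (the same layout as the sample chunk shown further down in these posts); the configuration is still hard-coded to "NUT000", exactly as in my script.

Code
import json

# One record in the layout my KAREL program writes: label, X Y Z, W P R, config.
record = "uframe_Data 411.096 215.795 -272.131\n .713 .059 -89.751\nN U T, 0, 0, 0"

data = record.split()   # whitespace split: label + 6 numbers + config tokens
JSON_DATA = {}
JSON_DATA[data[0]] = dict([("XYZ", [float(i) for i in data[1:4]]),
                           ("WPR", [float(i) for i in data[4:7]]),
                           ("CONFIG", "NUT000")])

print(json.dumps(JSON_DATA, indent=2))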


In KAREL I am accessing the frame data like this:

    GET_VAR(entry, '*SYSTEM*', '$MNUFRAME[1,9]', camframe, STATUS)

WRITE file_var('camera_frame')  -- 12 bytes (not sure)
WRITE file_var(camframe, CR)    -- 56 bytes

-- writing the other frames' data follows the same process


Since camframe is POSITION data, only 56 bytes + 12 bytes should be written, and the remaining bytes of the write buffer should be handled by the carriage return, but for me somehow it is not working.

    Thank you 1Cllif

    If VISION GET_NFOUND is not available, I would just call GET_OFFSET repeatedly until it fails (jumps to the label provided).

    Hi Jay,


I am storing the vision-found position into a PR, and I noticed that the configuration of the PR changes from NUT to NDB. Why is the configuration changing in the following simple assignment?

    Code
    PR[1] = VR[1].FIND_POS(1)

    Here is the received data from the robot Socket server.

    I have not set any file attribute.

I am writing the data to the file like this:

    Waiting for response.

    Thanks a lot.

    Hi all,

I have established socket communication between the robot's SM (Socket Messaging) server and the Python app (SM client), but I am having trouble handling the data chunks coming from the socket server. Sample chunk:

'--' is used as a delimiter by the Python app to split the incoming data. I got the following list in the Python application; the remaining data is missing.

    ['uframe_Data 411.096 215.795 -272.131\n .713 .059 -89.751\nN U T, 0, 0, 0', 'camframe_Data 332.167 77.494 -273.281\n .681 -.038 -89.856\nN U T, 0, 0, 0', 'utool_Data 0.000 0.000 115.000\n 0.000 0.000 0.000\nN']


What is the correct way to receive all of the data from the KAREL program that is responsible for handling the socket communication?
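What I have understood so far is that TCP is just a byte stream, so recv() can return part of a record or several records glued together, and the Python side has to reassemble them itself. Here is the minimal client-side framing I am trying, assuming the KAREL program writes '--' between records; the host and port are placeholders for my setup.

Code
# Delimiter-based framing on the Python (client) side. recv() may return
# partial or merged records; the buffer keeps the trailing partial record
# until the rest of it arrives.
import socket

HOST = "192.168.1.10"   # placeholder robot address
PORT = 59002            # placeholder server-tag port
DELIM = b"--"

def read_records():
    with socket.create_connection((HOST, PORT)) as sock:
        buffer = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:               # server closed the connection
                break
            buffer += chunk
            while DELIM in buffer:      # emit every complete record in the buffer
                record, buffer = buffer.split(DELIM, 1)
                yield record.decode("ascii", errors="replace").strip()

if __name__ == "__main__":
    for rec in read_records():
        print(repr(rec))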

I also tried the following write commands:

    Code
DELAY 500
WRITE file_var('uframe_Data ')
WRITE file_var(uframe, CR)
WRITE file_var('utool_Data ')
WRITE file_var(utool, CR)
WRITE file_var('Home_POS ')
WRITE file_var(lpos, CR)
DELAY 500

With these write commands, the received data includes a small portion of the previous as well as the next data segment.


    I have the following two questions regarding the robot Socket server:

1. How many clients can the robot's socket server handle?

2. How do I detect client disconnections correctly? Will MSG_DISC() work for this? I have had no success handling client disconnections.

I want the server to stop sending data when a client disconnects and wait for a reconnect.

    Code
VAR
  pos1, pos2, pos3 : XYZWPR
  pos4 : POSITION

BEGIN
  pos4 = pos2 : INV(pos1)
  pos3 = pos4
END

Most of the time the robot does automatic conversions between XYZWPR and POSITION.

    Hi,

Is it possible to access the fields of POSITION data just like XYZWPR?

I have declared

    uframe, utool : POSITION

and I get the following error:


What is the correct syntax to access the fields of this POSITION variable? This syntax works for an XYZWPR.
