Posts by jay

    KAREL programs can reference variables from other programs via the FROM keyword:


    Code
    VAR
      other_var FROM other_prog : INTEGER


    KAREL also features the %INCLUDE directive which literally includes a file line-for-line during translation:


    Code
    VAR
      local_var : INTEGER
      -- includes other_vars.kl
      %INCLUDE other_vars


    Just make sure the included file uses the FROM keyword; otherwise the variables will simply be duplicated in your new program.
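
    For example, an include file pulled in via %INCLUDE (inside a VAR section) might look like this. The file and variable names here are hypothetical:


    Code
    -- other_vars.kl
    -- FROM makes these references to other_prog's variables
    -- rather than new local copies
    shared_count FROM other_prog : INTEGER
    shared_flag  FROM other_prog : BOOLEAN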

    I've never had a problem backing up the translated KAREL binary (PC) files. The .KL source files will generally not be present on the robot, and I'm pretty sure you can't upload them to MD:, so they will not come down in a file backup.


    Creating a simulation from an image must be a new feature... glad to hear it's been added.

    A FILE > Backup All should save all your KAREL PC files just fine. My guess is it was an issue with ROBOGUIDE's create-workcell-from-backup serialization process. You can check this by simply verifying the PC files exist in the backup you took from the physical robot.


    If there is an issue with ROBOGUIDE, you may just have to manually load the PC files after your workcell is created as an extra step.


    You cannot create a ROBOGUIDE workcell from an image backup, and I'm also fairly certain you can't do an image backup restore in ROBOGUIDE either.

    I wouldn't be surprised at small differences like this between major releases. The R-30iB platform has major changes from the ground up (system architecture, options, etc.) vs. R-30iA, similar to the R-J3iB to R-30iA transition. My guess is significant changes to Constant Path are causing your new motion profile for the same programs.

    Quote

    Yeah, I attempted to follow the manual to no avail. IE the TP doesn't have the Wiz that you speak of


    My guess is iRPickTool got rid of that TP menu in favor of a web-based wizard, but I haven't seen the iRPickTool manual yet.


    What is the sign (+/-) of your encoder scale? If the encoder counts decrease when moving the conveyor in your production direction, the encoder scale should be negative, e.g. -30 cnt/mm. If it's not, you can manually change the encoder scale from the Setup > Tracking menu (e.g. try switching the sign). However, if you are doing the setup correctly and end up with an incorrect scale, there is likely a larger problem.
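
    As a rough worked example (the numbers are made up): if the encoder reads 5000 counts, you move the conveyor 100 mm in the production direction, and it then reads 2000 counts, the scale works out negative:


    Code
    scale = (2000 - 5000) counts / 100 mm = -30 cnt/mm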


    Quote

    The robot just takes off in the opposite direction when I run the sample tracking program from FANUC's manual (copied verbatim the code).


    What does the actual operation position look like (actual x,y,z components) in the example program on the robot? Did you teach a reference position after setting a trigger? This can help us track down whether the issue is confined to just the encoder scale.

    Munimula, KCONVARS requires, at minimum, a file argument. You can also provide several other options:


    Usage: KConvars [/p] infile [outfile][/config inifile]
    Valid file types:
    .sv - System variable file
    .vr - KAREL variable file
    .io - I/O configuration file
    /p - Pause after translating
    /config - Robot configuration file (default robot.ini)


    To select version, use /ver version_id,
    where version_id is an installed version.


    You can do this either by dragging your VR file onto the KCONVARS executable (this will presumably create a VA file in the same dir as KCONVARS), or by using a command prompt.
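
    For example, from a command prompt (the file names here are hypothetical, and I'm assuming the output is a VA listing based on the usage text above):


    Code
    > KCONVARS myvars.vr myvars.va /config robot.ini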

    Hi there,


    > My first question is, I have the encoder set up but it is currently counting backwards, how do I switch this to count forwards without remounting (not an option due to space).


    Why do you want it to count forwards? From a tracking perspective, the robot doesn't care which way it counts. Depending on the direction, you'll get either a positive or a negative encoder scale.


    > Question 2 comes to tracking frames, how do I define a frame as "Tracking" so I can finish the picktool setup? Any and all assistance is greatly appreciated.


    If this is your first time setting up PickTool, I'd highly recommend following the manual step by step. It covers everything from setting up the PT objects to teaching your tracking frames. If you do anything out of sequence or incorrectly, you may end up with a bad setup.


    To answer your question, tracking frames are different from regular USER frames. Tracking frames coincide with your tracking schedules (Setup > Tracking), not the Setup > Frames menu. It's basically a frame that moves in X relative to your encoder trigger. I haven't had a chance to mess around much with the new iRPickTool, but the tracking frame setup used to be under Setup > PT Trk Wiz. Again, see the manual. There should be a step-by-step quick-start guide.

    Quote

    Are we dreaming or is this something we can do?


    The short answer is: yes, you can probably do this, as long as vision can distinguish between your parts.


    iRVision gives you many different types of vision processes (2d single view, 2d multi-view, depalletizing, 3DL single view, bin-pick search, etc.) and with each vision process, you have many different tools to use: GPM, blob, curved surface, edge pair, etc. You may have to use one or many of these to consistently identify your parts.


    The simplest vision process is probably a 2D single view + GPM. You have a statically-positioned camera looking at your part. GPM stands for geometric pattern match; it basically turns the camera's 2D image into a bunch of edges and tries to match whatever it sees against a pattern you've trained. You set various thresholds on how close a possible match has to be to the originally taught pattern, which allows the process to identify (or not) any parts within the field of view.


    The programming for what you're asking can be pretty simple. One way to do it:



    LBL[1] ;
    VISION RUN FIND 'product_one' ;
    VISION GET_NFOUND 'product_one' R[1] ;
    IF R[1]>0,JMP LBL[100] ;
    VISION RUN FIND 'product_two' ;
    VISION GET_NFOUND 'product_two' R[1] ;
    IF R[1]>0,JMP LBL[200] ;
    ! neither product 1 nor 2 found ;
    JMP LBL[1] ;


    LBL[100] ;
    ! do something with product 1 e.g. ;
    CALL WELD_P01 ;
    JMP LBL[1] ;
    ;
    LBL[200] ;
    ! do something with product 2 e.g. ;
    CALL WELD_P02 ;
    JMP LBL[1] ;


    Of course there are 1000 ways to accomplish the same thing. If you're completely new to machine vision, you may want to consider attending a FANUC training class or spending a few hours with a local expert.

    It's been a while, but I'm pretty sure the pallet pick routine has always been PKPALSTK.TP. The motion segment speeds are hard-coded, so you'll have to modify them directly or replace them with a register reference.
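
    For example, a hard-coded segment speed could be swapped for an indirect register speed. The register number here is arbitrary, and this assumes your controller supports register speeds:


    Code
    ! hard-coded speed ;
    L P[1] 500mm/sec CNT100 ;
    ! register-based speed ;
    L P[1] R[50]mm/sec CNT100 ;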


    have you ever watched anybody trying to learn vi(m) from scratch?


    I haven't had the pleasure, but I remember getting pretty frustrated at first. It wasn't until I unmapped my arrow keys that I started really getting comfortable navigating around with h, j, k, l, w, e, $, ^, etc.


    Speaking of off-topic: if you're a Chrome user, check out Vimium. Great browser extension for vi(m) users.

    Even if you haven't set up a UFRAME or UTOOL correctly, your local program positions will save with the currently active UFRAME and UTOOL.


    1. Create a new program
    2. Press Shift+COORD
    3. In the menu that appears in the upper right, highlight TOOL
    4. Press 5
    5. Press Shift+COORD again
    6. Highlight User
    7. Press 5
    8. Record a position
    9. Highlight the position and then hit F5 (POSITION)
    10. You should see UF:5 UT:5 at the top


    Does that make sense? I don't think there are any setup options or sysvars you can change to ignore the active UFRAME/UTOOL.


    If you go through the above process again with the actual UTOOL and UFRAME you want, but the positions do not appear to be relative to your UFRAME, you should investigate your UFRAME and ensure it is taught correctly. You can verify this by 1) activating the UFRAME (SHIFT+COORD), then 2) going to the Position screen (POSN) and comparing your position in USER coordinates vs. WORLD coordinates. Unless your UFRAME is all zeroes, the two should differ.

    Unfortunately local program positions (e.g. P[1], P[2], etc.) are tied to a specific UFRAME and UTOOL.


    If you want the robot to do the same process with respect to two different frames with the same points/program, use Position Registers.


    e.g.:

    ! PROCESS_1.TP ;
    UFRAME_NUM=1 ;
    L PR[1] ...
    L PR[2] ...
    L PR[3] ...


    ! PROCESS_2.TP ;
    UFRAME_NUM=2 ;
    L PR[1] ...
    L PR[2] ...
    L PR[3] ...


    Or better yet...



    ! PROCESS.TP ;
    UFRAME_NUM=R[1:Process] ;
    L PR[1] ...
    L PR[2] ...
    L PR[3] ...


    This assumes that you want any touchups to apply to all of your processes. If you want to use unique Positions for each process, you can copy/paste your program and then use the Frame Offset Utility (covered in this thread: http://www.robot-forum.com/rob…aught-in-wrong-userframe/) to convert the positions from one frame to another.

    When you press SHIFT+COORD, a small menu pops up in the upper-right corner that shows your current TOOL frame, JOG frame and USER frame. You can change any of these by using the arrow keys to highlight the appropriate item and then hit a number key on the keypad to change the setting. (Hit . to activate 10)


    While you're programming, any new local positions will use the currently activated UTOOL and UFRAME.


    To change your UTOOL and UFRAME programmatically, use the UTOOL_NUM= and UFRAME_NUM= statements. You can find these under the INST > Offset/Frames menu.


    e.g.:
    UFRAME_NUM=2 ;
    UTOOL_NUM=1 ;


    It's good practice to set your UFRAME and UTOOL explicitly in your TP programs. You'll get a UTOOL/UFRAME mismatch error when attempting to move to a local position taught with a different UFRAME or UTOOL, but a Position Register will move to its position with whatever's active. This can easily cause a crash if you're not careful.
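
    A defensive pattern (the frame and register numbers here are arbitrary) is to set both frames explicitly right before any Position Register motion:


    Code
    UFRAME_NUM=2 ;
    UTOOL_NUM=1 ;
    L PR[10] 250mm/sec FINE ;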


    Hope this helps.
