Does anyone have experience with the OrangeApps KUKA Educational Robot

  • Hello all,

    I am currently working on an internship assignment that uses the OrangeApps (6DOF) Educational KUKA robot.


    I run the OfficeLite (KSS 8.7) software in a virtual machine under Hyper-V, and as far as I know the same software is used to program the actual KUKA robot.

    I wanted to start this forum to ask questions now and in the future to anyone that has some experience with the Educational Robot.

    I am currently studying the Robot Manual and the program language and I've found various tutorials on how to use Linear and PTP pathing.

    My ultimate goal is to use vision to detect the location and coordinates of a ball and have the robot "pick-and-place" the ball elsewhere.

    These are my current inquiries:

    1) Is there a way to manually type and assign coordinates to points without finding the position and using touch-up?

    2) Is there a way to read from external files in KRL (to read and write the coordinates for points)?

    Thanks in advance and cheers! ^^

  • No experience with that robot, but from the look of the sales page, my guess is that your KRL program executes in OfficeLite, and "streams" commands to the Lego-bot using a tech package added to OL.

    If I'm correct, then most of the "heavy lifting" will be in OL. OL can definitely access external files under KSS 8. The details are in the CWRITE manual. I've also attached a "library" KRL module I created and use all over that includes routines to Open, Append, and Close files.

    The GlobalData module loads and parses data from a CSV text file.

    My main question would be if OL can "talk" to a vision system, which is outside of my experience.

  • Hello SkyeFire,

    Thank you so much for your reply.

    Your guesses are correct: I use OL and stream commands to a reprogrammed Lego Spike Prime controller, which then drives the robot.

    I genuinely appreciate that you have provided me with these resources and I'll put them to good use.

    In addition, I have thought out a method of operation for the project to give you some more insight.

    1) Use a python (pycharm) script to detect the ball and gather the coordinates

    2) Send the coordinates to a .txt or .csv file on my main OS

    3) Use kukavarproxy (link below*) to create a communication channel between my main OS and the virtual OS that OL runs on.

    4) Have OL read the coordinates and amend positions using the functions you have provided

    5) Run the program with new positions for testing


    If you have any other suggestions I am all ears.

    Thanks again and cheers! ^^

  • It sounds reasonable. KVP is probably the cheapest way to get data between a PC OS and a KRC, without buying EKX.

    In an ideal world, you could get the PC-side KVP to talk directly to the vision system, and skip the file read/write cycle, but that would definitely be a higher programming bar to get over. Maybe something for future development.
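To make the kukavarproxy step concrete, here is a sketch of how a PC-side client typically frames a KVP telegram: big-endian message ID and payload length, a read/write mode byte, then a length-prefixed variable name (and, for writes, a length-prefixed value string). This layout follows open-source KVP clients such as py_openshowvar; treat the field details as an assumption and verify them against the KVP build you actually run. The variable names below are hypothetical.

```python
import struct

def build_read_request(msg_id, var_name):
    """Frame a kukavarproxy 'read variable' telegram (mode byte 0)."""
    name = var_name.encode("ascii")
    # mode (1 byte) + name length (2 bytes, big-endian) + name
    body = struct.pack(">BH", 0, len(name)) + name
    # message id (2 bytes) + body length (2 bytes) + body
    return struct.pack(">HH", msg_id, len(body)) + body

def build_write_request(msg_id, var_name, value):
    """Frame a kukavarproxy 'write variable' telegram (mode byte 1)."""
    name = var_name.encode("ascii")
    val = value.encode("ascii")
    body = (struct.pack(">BH", 1, len(name)) + name
            + struct.pack(">H", len(val)) + val)
    return struct.pack(">HH", msg_id, len(body)) + body
```

The telegrams go over a plain TCP socket to the VM's IP on kukavarproxy's default port 7000; a position write would carry a KRL aggregate string as its value, something like `{X 412.5, Y -103.0, Z 25.0}` for a hypothetical global variable declared on the OL side.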
