Electrical & Code 2019

Attendance

  • Students / Mentors: David S., Sam K., Shashank H.
  • Guests: Malia & Nolan

Note Info

  • Date & Time: 1/14/2019, 3 hours
  • Location & Author: Aquiferland, Bryn G.


Notes

Adam Todo - Create a doc outlining the roboRIO setup steps for our team. Especially cover the CAN configuration, since the existing docs for FRC teams are poor.

Work Completed

  • Began 2019 code outline --> lots of copying and pasting from 2018
    • robot.py
    • const.py
    • oi.py
    • made files for commands & subsystems, but they don't have anything in them yet
  • Fixed Frisbee Bot --> front wheels were not working; fixed some motor controller PWM cables, etc.
  • Fixed Atla-Z --> drivetrain was having some issues
  • Re-wired cables and removed the camera from Atla-Z to put the Limelight at the front of the robot (where the camera previously was)
  • Sam practiced driving
  • Earlier in the day, Adam fixed the Atla-Z Xbox controller code so it works with the new release, and he figured out the CAN configuration (a sketch of the setup follows this list)
  • Finished imaging / setting up Poly with 2019 stuff
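
A rough sketch of what that CAN configuration looks like in RobotPy, assuming CTRE Talon SRX controllers via the robotpy-ctre package; the device IDs below are placeholders that would have been assigned beforehand in CTRE's configuration tool, not what Adam actually set up:

    import ctre

    LEFT_ID = 1    # placeholder CAN device IDs, set in CTRE's tool;
    RIGHT_ID = 2   # these are not PWM port numbers

    left = ctre.WPI_TalonSRX(LEFT_ID)
    right = ctre.WPI_TalonSRX(RIGHT_ID)

    # WPI_TalonSRX acts as a SpeedController, so the pair drops straight
    # into wpilib.drive.DifferentialDrive the way PWM controllers would.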

Challenges

  • Couldn't find much information on using I2C sensors in FRC, or with RobotPy specifically, so we spent the time moving cables and the Limelight on Atla-Z
  • Atla-Z and Frisbee were both broken...
  • Difficulty running a motor to test the Cargo prototype since we have no good setup for it (see item 1 in Work for Next Meeting)

Work for Next Meeting

  1. Build a setup to run a motor when testing and making prototypes [ <- hold off on this, we ordered a motor tester -Adam ]
    • Zach suggests using a three-way switch (on-forward, off, on-reverse)
    • The motor plugs into one side (two small Anderson connectors)
    • A battery plugs into the other side (large Anderson connector on the battery)
  2. Drivetrain Code
    • Continue working on robot.py, const.py, and oi.py
    • Begin commands/drivetrain.py & subsystems/drivetrain.py (a skeleton sketch follows this list)
    • Goal is to be done by early next week, when the drivetrain itself is supposed to be done
  3. Vision
    • Drive the robot toward the vision target until it is in view of the Limelight; then have the robot drive to a specified distance and turn to a specified angle from the target (using a calibrated position and the tx & ty values from the Limelight)
    • Get the Rotate_To_Angle_Limelight command working on Atla-Z. The PID will need tuning (an aiming sketch follows this list).
    • Continue learning how to use the Limelight
      • Specifically learn what all of the different values in the Limelight's NetworkTables table are and how they can be used
      • Continue to look into the Limelight documentation
    • Go to a specified range from the target (check the case studies on the Limelight website & documentation)
    • Go to a specified range and aim at the target (again, check the case studies)
  4. Line Following  (Ordered a set of 3 line trackers from Vex; wait for those to arrive before resuming this work. -Adam)
    • Mount the color sensor to Atla-Z. It needs to be only a few centimeters from the floor, pointed at the carpet.
    • Determine what sensor object to use in code to get values from it. It's an I2C device (a raw-I2C sketch follows this list).
    • Would be used to follow the white tape that runs out from the spots where hatch panels are placed
    • Ultimate question is - should line following be used instead of Limelight vision in some places? Together with Limelight vision? Or not at all this year?
  5. Talk to other subteams and drivers to determine what semi-autonomous controls are desired --> need a list of everything we wish to accomplish and the order of priorities --> this has been started, but more work is definitely needed
    • Will need to finalize these after designs are finalized but should start to get an idea now
    • Then figure out what the best ways are to implement them
    • Encoders? Gyros? Limit Switches? Vision? Line Following? etc. etc.
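
For item 2, a minimal skeleton of what subsystems/drivetrain.py could look like in the 2019 command-based style, matching the robot.py/const.py/oi.py layout above. Class names, motor controller types, and ports here are placeholders, not our final code:

    import wpilib
    import wpilib.drive
    from wpilib.command import Subsystem

    class Drivetrain(Subsystem):
        def __init__(self):
            super().__init__("Drivetrain")
            # Placeholder PWM controllers; ours will likely be CAN Talons,
            # with the port/ID numbers pulled from const.py.
            left = wpilib.Spark(0)
            right = wpilib.Spark(1)
            self.drive = wpilib.drive.DifferentialDrive(left, right)

        def arcade_drive(self, speed, rotation):
            self.drive.arcadeDrive(speed, rotation)

        def initDefaultCommand(self):
            # The default command would live in commands/drivetrain.py and
            # read joystick values from oi.py on every scheduler pass.
            pass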
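
For item 3, a sketch of reading the Limelight's values from NetworkTables plus a bare P-loop of the kind Rotate_To_Angle_Limelight needs. The tv/tx keys are the Limelight's documented NetworkTables entries; the kP gain and the output clamp are untuned placeholders:

    from networktables import NetworkTables

    # On the robot this table already exists; the Limelight publishes to it.
    table = NetworkTables.getTable("limelight")

    def aim_output():
        tv = table.getNumber("tv", 0)  # 1.0 when a target is in view
        tx = table.getNumber("tx", 0)  # horizontal offset to target, degrees
        if tv < 1.0:
            return 0.0                 # no target: don't rotate
        turn = 0.03 * tx               # placeholder kP; PID will need tuning
        return max(-0.5, min(0.5, turn))  # clamp the rotation command

    # Each teleop/autonomous pass: drivetrain.arcade_drive(0, aim_output())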
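
For item 4, a sketch of talking to the color sensor over raw I2C with wpilib, in case no ready-made sensor class turns up. The address, register, and byte order are placeholders that depend on the actual device's datasheet:

    import wpilib

    SENSOR_ADDR = 0x39    # placeholder 7-bit I2C address from the datasheet
    DATA_REG = 0x00       # placeholder register holding the reading

    i2c = wpilib.I2C(wpilib.I2C.Port.kOnboard, SENSOR_ADDR)

    def read_raw():
        # read() writes the register address, then reads `count` bytes back
        data = i2c.read(DATA_REG, 2)
        return (data[1] << 8) | data[0]  # byte order depends on the sensor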

On Schedule?

Yes. Since we started the drivetrain code yesterday, we should be on track to finish it by the time the drivetrain itself is done. It would be great to figure out basic Limelight control on Atla-Z by the end of this week or next, so we will be ready to implement it on the new robot.