Adam added code to Poly's teleopPeriodic to time the shooter motor's RPM while lifting. When the shooter trigger is held, it waits 0.5 seconds for the motor to get up to speed, then counts encoder ticks over a 1.0-second window, converts them to motor (not output shaft) RPM, and prints the result to the SmartDashboard as "Shooter timed rpms"
With no additional load the motor turns at about 12,962 RPM
With the bucket load it's 10,200 RPM. That's short of the 14,000 RPM goal Chris set, but he says "It should be fine".
Also noted that with the bucket load the motor draws 23-24 amps
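The tick-to-RPM conversion described above can be sketched as a small helper (a sketch only: the encoder counts-per-revolution and gear ratio below are placeholder values, not the actual numbers from Poly's code):

```python
def ticks_to_motor_rpm(ticks, sample_seconds=1.0, ticks_per_rev=1024, gear_ratio=1.0):
    """Convert encoder ticks counted over a timing window to motor RPM.

    ticks_per_rev and gear_ratio are placeholders; substitute the real
    encoder resolution and the ratio between the encoder shaft and the
    motor shaft on Poly.
    """
    # Revolutions of the encoder shaft during the sample window
    encoder_revs = ticks / ticks_per_rev
    # Scale to revolutions per minute
    encoder_rpm = encoder_revs * (60.0 / sample_seconds)
    # Convert encoder-shaft RPM to motor-shaft RPM
    return encoder_rpm * gear_ratio
```

With a 1.0-second window this is just ticks / ticks_per_rev * 60, scaled by the gearing between the encoder and the motor.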
Other stuff - Bryn
I began the drivetrain command code this morning
Put Powerpole connectors on the 15 miniCIMs for the drivetrain
Pulled out pneumatic parts for the two robots
Ordered a new compressor
one of the smaller ones for the comp bot
it is almost a pound lighter than what we typically use and pulls 5 amps less current
Found a bunch of pneumatic stuff so we didn't have to order any more solenoid blocks and such
Soldered more 22 AWG wire onto the double solenoids' leads and crimped ferrules onto the connectors, since the original wires were really short
Challenges
Couldn't figure out where to order the pneumatic T-fittings
Work for Next Meeting
Make list of all the general electrical components on the robot (FRC4096 --> 2018 - 2019 --> Build Season --> Electrical folder --> Electrical components)
Build a set-up to run a motor when testing and making prototypes [ <-- hold off on this, we ordered a motor tester - Adam ]
Zach suggests using a three way switch (on forwards, off, on reverse)
One side, the motor is plugged in (two smaller anderson connectors)
A battery is plugged into the other side (large anderson connector on the battery)
Drivetrain Code
Continue working on robot.py, const.py, oi.py, commands/drivetrain.py, and subsystems/drivetrain.py
Goal is to be done by early next week when the drivetrain is supposed to be done
Vision
Drive the robot toward the vision target until the target is in view of the Limelight; then have the robot drive to a specified distance and turn to a specified angle relative to the target (using a calibrated position and the tx & ty values from the Limelight)
Get Rotate_To_Angle_Limelight command working on Atlaz. PID will need tuning.
Continue learning how to use limelight
Specifically learn what all of the different values on the network table are and how they can be used
Continue to look into the Limelight documentation
Go to specified range from target (check case studies on limelight website & documentation)
Go to specified range and aim at target (again check case studies)
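The aiming and ranging steps above can be sketched as pure functions (a sketch only: the gain, deadband, and geometry numbers are made-up starting points that will need tuning on Atlaz, and the real command would read tx/ty from the Limelight's NetworkTables entries):

```python
import math

def turn_output(tx_degrees, kP=0.03, min_cmd=0.05, max_out=0.5):
    """Simple proportional turn command from the Limelight's tx value
    (horizontal offset to target, in degrees). kP, min_cmd, and max_out
    are placeholder constants, not tuned values."""
    out = kP * tx_degrees
    # Apply a minimum command so friction doesn't stall small corrections
    if abs(tx_degrees) > 1.0 and abs(out) < min_cmd:
        out = math.copysign(min_cmd, tx_degrees)
    # Clamp to a safe maximum turn rate
    return max(-max_out, min(max_out, out))

def distance_to_target(ty_degrees, camera_height_m, target_height_m, camera_pitch_deg):
    """Estimate range to the target with the fixed-camera-angle method
    from the Limelight case studies: d = (h2 - h1) / tan(a1 + a2),
    where a1 is the camera's mounting pitch and a2 is ty."""
    angle = math.radians(camera_pitch_deg + ty_degrees)
    return (target_height_m - camera_height_m) / math.tan(angle)
```

"Go to specified range and aim" would then combine the two: drive forward/back on the range error while feeding turn_output into the rotation axis.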
Line Following (Ordered a set of 3 line trackers from VEX; wait for those to arrive before resuming this work. -Adam)
Mount the color sensor to Atlaz. Needs to be only a few centimeters away from floor, pointed at the carpet.
Determine which sensor object to use in code to read values from it; it's an I2C device.
Would be used to follow the white tape that is centered coming out from the spots where hatch panels are placed
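Once the sensors arrive, the follow-the-tape logic could start as something like this (a sketch under assumptions: three sensors mounted left/center/right, already thresholded to booleans, and a made-up correction magnitude):

```python
def line_follow_steer(left_on_line, center_on_line, right_on_line, correction=0.3):
    """Decide a steering correction from three boolean line-sensor
    readings (True = sensor sees the white tape). Positive output
    steers right; the correction magnitude is a placeholder to tune."""
    if center_on_line and not left_on_line and not right_on_line:
        return 0.0           # centered on the tape, drive straight
    if left_on_line and not right_on_line:
        return -correction   # tape is to our left, steer left
    if right_on_line and not left_on_line:
        return correction    # tape is to our right, steer right
    return 0.0               # no line (or ambiguous reading): hold course
```

The real sensor names and thresholds depend on which trackers we end up using; this only captures the decision logic.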
Ultimate question is - Should line following be used instead of limelight vision in some places? Together with limelight vision? Or not at all this year?
Talk to other subteams and the drivers to determine what semi-autonomous controls are desired --> need a list of everything we wish to accomplish and an order of priorities --> this has been started, but more work is definitely needed
Will need to finalize these after designs are finalized but should start to get an idea now
Then figure out what the best ways are to implement them
Encoders? Gyros? Limit Switches? Vision? Line Following? etc. etc.