2016-05-06
James Slattery
James Slattery5:34 PM

@James Slattery has joined the channel

2016-06-14
James Slattery
James Slattery12:46 PM

@James Slattery set the channel purpose: Code the new website spartronics.onion :D

James Slattery
James Slattery12:46 PM

added an integration to this channel: github

James Slattery
James Slattery12:48 PM

^ Spartronics Github account still needs to be added. Don't know who has those credentials

Clio Batali
Clio Batali10:10 PM

@Clio Batali has joined the channel

2016-06-22
Alex Larson Freeman
Alex Larson Freeman7:34 PM

@Alex Larson Freeman has joined the channel

2016-07-18
Jon Coonan
Jon Coonan2:45 PM

@Jon Coonan has joined the channel

2016-09-13
Enrique Chee
Enrique Chee10:05 PM

@Enrique Chee has joined the channel

2016-09-14
James Slattery
James Slattery10:50 AM

@James Slattery set the channel topic: Things about programming

2016-09-17
Dana Batali
Dana Batali11:24 AM

@Dana Batali has joined the channel

Binnur Alkazily
Binnur Alkazily12:44 PM

@Binnur Alkazily has joined the channel

Michelle Dalton
Michelle Dalton5:26 PM

@Michelle Dalton has joined the channel

James Slattery
James Slattery9:23 PM

@James Slattery set the channel purpose: make robot move

2016-09-18
Riyadth Al-Kazily
Riyadth Al-Kazily11:41 AM

@Riyadth Al-Kazily has joined the channel

2016-09-19
Chris Rininger
Chris Rininger8:39 AM

@Chris Rininger has joined the channel

Chris Rininger
Chris Rininger8:44 AM

Hey, I took the first few weeks of a class on robotics controls / automation this summer before I got too busy to keep up with it. It's a free course on Coursera & they start a new session every 2 weeks. I believe you can sign up for the course and then have access to all the course materials, including many, many videos. I have a feeling the teams that excel at automation are deep into the concepts the course teaches, and I was thinking there might be a way for a group of Spartronics mentors & students to at least survey the course. Here's a link: https://www.coursera.org/learn/mobile-robot --> check it out!

Clio Batali
Clio Batali9:43 AM

Looks cool, thanks!

2016-09-21
Timo Lahtinen
Timo Lahtinen4:22 PM

@Timo Lahtinen has joined the channel

2016-09-23
Chris Rininger
Chris Rininger6:26 PM

I joined a FIRST mentor discussion group at Microsoft, and there's a coffee chat coming up about how to approach automation and how a team can learn & improve. A few people with experience are going to share insights. I'm planning to attend, and if there are any questions / topics anyone would like raised, please let me know.

2016-09-24
Clio Batali
Clio Batali8:12 AM

If anything about the new radio comes up, any insight would be great! Also thoughts on how to handle the preseason and a larger programming team (this year it looks like we'll have to train about 15 freshmen). Thanks!

Clio Batali
Clio Batali8:14 AM

(@chrisrin: )

Clio Batali
Clio Batali9:09 PM

Okay, update on ARES: the motor controllers were fine after resetting them (12 was acting up today, but 10 was fine), but the launcher module wasn't working. We loaded the most updated code on github to the robot to make sure our versioning was correct, and that solved the problem. We stress tested for a solid hour, and all the relevant autonomous modes work fine (except the low bar auto - the portcullis arm drops, the launcher repositions to neutral, but the robot doesn't drive at all). A chain fell off (all fixed), and the light positioning needs to be re-calibrated. In all, the robot is functional now! Wiring is checked (we'll need to keep an eye on a few things before competition) and that auto needs to be figured out, but ARES is ready to use for out girls' gen meetings.

Clio Batali
Clio Batali9:09 PM

*our

Enrique Chee
Enrique Chee9:11 PM

Thanks for the recap !!

Jack Stratton
Jack Stratton11:16 PM

@Jack Stratton has joined the channel

2016-09-25
Riyadth Al-Kazily
Riyadth Al-Kazily8:52 PM

I found a couple of nice videos describing PID control, in case anyone wants to find out more about what it really is and how it works:
https://www.youtube.com/watch?v=UR0hOmjaHp0
https://www.youtube.com/watch?v=XfAt6hNV8XM
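For anyone who'd rather see it in code than video, the core PID loop is only a few lines. This is an illustrative sketch (the class name and structure here are made up for the example; for actual robot code WPILib provides a real PIDController class):

```java
// Illustrative PID sketch, not the WPILib PIDController.
// output = kP*error + kI*integral(error) + kD*d(error)/dt
public class SimplePid {
    private final double kP, kI, kD;
    private double integral = 0.0;
    private double lastError = 0.0;

    public SimplePid(double kP, double kI, double kD) {
        this.kP = kP;
        this.kI = kI;
        this.kD = kD;
    }

    // error = setpoint - measurement; dt = seconds since the last call
    public double calculate(double error, double dt) {
        integral += error * dt;                       // accumulate the I term
        double derivative = (error - lastError) / dt; // rate of change for the D term
        lastError = error;
        return kP * error + kI * integral + kD * derivative;
    }
}
```

Tuning comes down to picking kP, kI, and kD so the output converges on the setpoint without oscillating, which is exactly what the videos walk through.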

2016-09-27
Jack Stratton
Jack Stratton5:05 PM

sign up and get a free shirt for making 4 pull requests (whether or not they're accepted) to any github project in october https://www.digitalocean.com/company/blog/ready-set-hacktoberfest/

2016-09-28
Chris Rininger
Chris Rininger7:42 AM

Riyadth, those videos are great! Thanks for sharing.

2016-10-01
Lia Johansen
Lia Johansen10:48 PM

@Lia Johansen has joined the channel

2016-10-05
Jeff Dalton
Jeff Dalton3:23 PM

@Jeff Dalton has joined the channel

2016-10-06
Dana Batali
Dana Batali3:25 PM

a nice summary of motors and motor controllers: https://www.youtube.com/watch?v=5thxBgew7N0&feature=youtu.be

Clio Batali
Clio Batali3:28 PM

Thanks for sharing!

Timo Lahtinen
Timo Lahtinen7:56 PM

For next team meeting with programming: Lia and I think that teaching the new team members about git / creating an account, forking, installing eclipse and plugins, and possibly pushing would make for a good first subteam meeting. Mentors - do you have any suggestions? Jack - would you be willing to help teach git?

Riyadth Al-Kazily
Riyadth Al-Kazily8:27 PM

I think that's a good plan, but it might be a lot for the new people to get done in one session. Also, it doesn't get their fingers in the code. Maybe if you skip forking/branching/pushing on the first round, but instead:
1. set up the development environment
2. create a github account and clone the Stronghold repo
3. compile the code and download it to the robot (introduces networking concepts)
4. if they have some idea about Java already, maybe change something in the code and see the change after they download (we might have to come up with a suggested "assignment")

Riyadth Al-Kazily
Riyadth Al-Kazily8:29 PM

That way we could save the Git tutorial for the next time, when they already have a copy of the repository to refer to. You can explain how the programming subteams all work together on the main code by each working in their own repos. New people really need to understand that workflow well, so it's best to teach it to them without also having to go through the hassle of getting all the tools working.

Jack Stratton
Jack Stratton8:38 PM

There must be a way to teach git that a) doesn't confuse people too badly, b) doesn't require homework/reading at home (nobody ever does)

Jack Stratton
Jack Stratton8:38 PM

I can't think of it though...

Jack Stratton
Jack Stratton8:39 PM

(to actually answer your question, I wouldn't mind helping teach git but I told Chee I wouldn't be a leader... I guess that makes me a student mentor then)

Enrique Chee
Enrique Chee8:43 PM

Just because you are not in leadership, does not mean you can't help. Go for it. We need your help. We are a team.

James Slattery
James Slattery8:43 PM

@Jack Stratton code academy came out with a git thing I think

Lia Johansen
Lia Johansen8:54 PM

Riyadth, good idea. Thanks. We will have them work with github and the code to become familiar. We leaders can explain how we work and communicate throughout the subteams.

2016-10-07
Dana Batali
Dana Batali9:31 AM

i, too, would recommend delaying discussions of git in favor of focus on robot code, eclipse setup, and even “starting from scratch” with robot code (from WPI tutorials)...

2016-10-10
Dana Batali
Dana Batali10:30 AM

one of the central device-types we need to program is the servo motor:
Here is some helpful background on design and control aspects:

http://www.robotplatform.com/knowledge/servo/servotutorial.html
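The short version of that tutorial: a hobby servo is commanded by pulse width, typically about 1.0 ms at one end of travel, 1.5 ms at center, and 2.0 ms at the other end, repeated at roughly 50 Hz. A sketch of the angle-to-pulse math (the 1.0-2.0 ms range is the typical textbook value; real servos vary, so check the datasheet — the class and method names here are just for illustration):

```java
// Map a servo angle (0-180 degrees) to a command pulse width in microseconds.
// Assumes the common 1.0 ms - 2.0 ms range; your servo's datasheet may differ.
public class ServoMath {
    static final double MIN_PULSE_US = 1000.0; // pulse width at 0 degrees
    static final double MAX_PULSE_US = 2000.0; // pulse width at 180 degrees

    static double pulseMicros(double angleDeg) {
        // Clamp so out-of-range requests don't command an invalid pulse.
        double clamped = Math.max(0.0, Math.min(180.0, angleDeg));
        return MIN_PULSE_US + (clamped / 180.0) * (MAX_PULSE_US - MIN_PULSE_US);
    }
}
```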

2016-10-12
Binnur Alkazily
Binnur Alkazily8:47 PM

@Lia Johansen you already have the ‘manager’ rights on the google programming group if you want to set that up in the mean time - fyi

2016-10-13
Dana Batali
Dana Batali3:32 PM

hm … these changes are likely to cause us a little more pain this year than last. Not a giant amount, but best for electronics and programmers to read the announcement a couple times.

2016-10-15
Dana Batali
Dana Batali1:51 PM

Here are some slides from a recent nvidia webinar on "deep learning". This is a common approach to machine vision and is "all the rage" right now: http://on-demand.gputechconf.com/gtc/2016/webinar/embedded-deep-learning-nvidia-jetson.pdf

Enrique Chee
Enrique Chee10:53 PM

@Enrique Chee pinned their File to this channel.

Enrique Chee
Enrique Chee10:53 PM

@Enrique Chee pinned their File to this channel.

Enrique Chee
Enrique Chee11:42 PM

Here is a link for Java programming in FIRST. Must read for all programmers. https://wpilib.screenstepslive.com/s/4485/m/13809

Enrique Chee
Enrique Chee11:42 PM

@Enrique Chee pinned a message to this channel.

2016-10-16
Binnur Alkazily
Binnur Alkazily10:47 AM

@Lia Johansen please pull the latest code for girls gen.

We fixed Portcullis.java code that had mismatched '{' braces. Moving forward, we will be adopting a coding style guide to minimize this problem.

As a note, the autonomous selection was still wonky. Looking at this year’s code, the logic is more complex than it needs to be - this is an area we will be simplifying as well.
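The mismatched-brace problem above is exactly what a style guide prevents. A hypothetical example, assuming an opening-brace-on-the-same-line convention (the team's actual style guide may choose differently, and this class is made up for illustration):

```java
// Hypothetical style-guide example: every '{' opens on the line that starts
// the block, and every '}' closes at that line's indent level, so a missing
// or extra brace stands out immediately.
public class AutoSelector {
    public static String select(boolean lowBar) {
        if (lowBar) {
            return "LowBarAuto";
        } else {
            return "DefaultAuto";
        }
    }
}
```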

Binnur Alkazily
Binnur Alkazily10:50 AM

… whoever integrated the github bot to this channel, I like it, but I am concerned about the amount of noise it will generate when we deep dive into the programming season. And I am curious enough to see how well it will work. If the channel becomes noisy, we can address that later on.

Jack Stratton
Jack Stratton3:37 PM

@Binnur Alkazily I can move it to a new channel (#commits or something) if it's an issue

Binnur Alkazily
Binnur Alkazily3:38 PM

figured :slightly_smiling_face: we can hold off and see how it works when we launch into programming activities. right now, I like having everything in one place!

2016-10-17
Jack Stratton
Jack Stratton10:22 PM

2016 Seattle GNU/Linux conference is on November 11-12 at Seattle Central College, admission and food is free https://seagl.org

2016-10-19
Rose Bandrowski
Rose Bandrowski3:29 PM

@Rose Bandrowski has joined the channel

2016-10-23
Jack Stratton
Jack Stratton10:31 PM

@Binnur Alkazily to be honest, I only made those PRs (and not direct commits) because I wanted a free shirt

2016-10-24
Binnur Alkazily
Binnur Alkazily6:47 AM

:) we should talk about priorities ;-) hopefully it worked :)

2016-10-26
Kate Treviño-Yoson
Kate Treviño-Yoson12:02 PM

@Kate Treviño-Yoson has joined the channel

Olivia Pells
Olivia Pells6:43 PM

@Olivia Pells has joined the channel

Jack Stratton
Jack Stratton7:39 PM

@Jack Stratton pinned their File to this channel.

Declan Freeman-Gleason
Declan Freeman-Gleason8:16 PM

@Declan Freeman-Gleason has joined the channel

Jeremy Lipschutz
Jeremy Lipschutz8:40 PM

@Jeremy Lipschutz has joined the channel

Adrien Chaussabel
Adrien Chaussabel8:52 PM

@Adrien Chaussabel has joined the channel

Binnur Alkazily
Binnur Alkazily9:06 PM

@Jack Stratton: thanks!

Jack Stratton
Jack Stratton9:18 PM

@Binnur Alkazily I'll write up some stuff for the handbook

Binnur Alkazily
Binnur Alkazily9:18 PM

Love it! And looking forward to reading it :)

Enrique Chee
Enrique Chee9:38 PM

@Jack Stratton: thanks

Ronan Bennett
Ronan Bennett10:44 PM

@Ronan Bennett has joined the channel

2016-10-27
Benjamin Soldow
Benjamin Soldow12:46 AM

@Benjamin Soldow has joined the channel

Michael Nelson
Michael Nelson6:57 PM

@Michael Nelson has joined the channel

2016-10-28
Kaedric Holt
Kaedric Holt6:54 AM

@Kaedric Holt has joined the channel

2016-10-29
Clio Batali
Clio Batali11:18 AM

Fyi - the programming stuff in the robotics room has been moved closer to the door (on the right side of the safety glasses walking in - everything's labeled in drawers/cabinets)

Harper Nalley
Harper Nalley10:12 PM

@Harper Nalley has joined the channel

2016-10-30
Binnur Alkazily
Binnur Alkazily9:47 AM

what is the programming stuff? as a note, we should have a section in a filing cabinet for the licenses

Tom Wiggin
Tom Wiggin1:20 PM

@Tom Wiggin has joined the channel

Clio Batali
Clio Batali4:17 PM

Backup computers, keyboard/mouse, jetson, lights, cameras, kinect, etc.

2016-10-31
Jack Stratton
Jack Stratton3:11 PM

@Tom Wiggin are you MotGit?

Lia Johansen
Lia Johansen8:55 PM

Yeah he is @Jack Stratton

2016-11-01
Tom Wiggin
Tom Wiggin1:18 PM

I found a piece of open source software called Electric in the Ubuntu repository meant for making circuit diagrams and stuff. I heard that we needed a circuit diagram making thing this year so I thought I would help :slightly_smiling_face:

Tom Wiggin
Tom Wiggin1:19 PM

here's the website

Binnur Alkazily
Binnur Alkazily10:05 PM

Awesome! Thanks :)

Jack Stratton
Jack Stratton10:26 PM

more coming eventually...

2016-11-04
Tom Wiggin
Tom Wiggin8:34 PM

Why do I need a style guide?

Declan Freeman-Gleason
Declan Freeman-Gleason10:05 PM

So other people can read your code.

Jack Stratton
Jack Stratton11:03 PM

merge conflicts

2016-11-05
Tom Wiggin
Tom Wiggin12:42 AM

Why are voice calls a paid feature?

Tom Wiggin
Tom Wiggin12:43 AM

I'm sure I could set up a Mumble server for our team if we need voice communications

Tom Wiggin
Tom Wiggin12:43 AM

That could be a project we could work on before kickoff!

Tom Wiggin
Tom Wiggin12:50 AM

and we could use Filezilla for unlimited file transfers

Tom Wiggin
Tom Wiggin12:51 AM

Do we have any server machines?

Jack Stratton
Jack Stratton6:25 AM

we have each other's cell phones so normally we just call people

Tom Wiggin
Tom Wiggin5:30 PM

I'm waiting for my raging acne outbreak to disappear first

2016-11-06
Mike Rosen
Mike Rosen9:51 AM

@Mike Rosen has joined the channel

Mike Rosen
Mike Rosen9:58 AM

Has anyone tried getting the simulator "FRCSim" working? I'm following the instructions on https://wpilib.screenstepslive.com/s/4485/m/23353/l/228979-installing-frcsim-manually and having limited success: Gazebo comes up and renders the .world, and the robot code is clearly trying to talk to the simulator, but the robot in Gazebo isn't doing anything. I think I've got the attention of the WPI guy who did the Youtube videos, so I'm optimistic about making this work. Any interest in this from programming team members?

Charlotte Larson Freeman
Charlotte Larson Freeman10:21 AM

@Charlotte Larson Freeman has joined the channel

Jack Chapman
Jack Chapman10:24 AM

@Jack Chapman has joined the channel

Kenneth Wiersema
Kenneth Wiersema10:26 AM

@Kenneth Wiersema has joined the channel

Tom Wiggin
Tom Wiggin12:54 PM

We have a simulator for that?

Tom Wiggin
Tom Wiggin12:56 PM

Why wasn't I told about this?!

Chris Mentzer
Chris Mentzer1:03 PM

@Chris Mentzer has joined the channel

Jack Stratton
Jack Stratton1:06 PM

@Tom Wiggin we've never used it before; it was just brought up by someone (don't remember who, sorry) at the SkunkWorks workshops yesterday from his own research :)

Tom Wiggin
Tom Wiggin1:07 PM

k

Tom Wiggin
Tom Wiggin1:08 PM

I get notifications on my laptop whenever someone posts anything on slack

Tom Wiggin
Tom Wiggin1:08 PM

its quite neat

Jack Stratton
Jack Stratton1:09 PM

The guy who developed it was helping present yesterday, apparently it has gone from "basically unusable" to "it's very cool if you can flick the right levers" this year

Jack Stratton
Jack Stratton1:09 PM

would be great if you could set it up

Tom Wiggin
Tom Wiggin1:09 PM

What went from unusable to somewhat usable after a sacrifice to the software gods?

Tom Wiggin
Tom Wiggin1:10 PM

was it the simulator?

Jack Stratton
Jack Stratton1:59 PM

The simulator, yeah.

Tom Wiggin
Tom Wiggin2:32 PM

ok...

Tom Wiggin
Tom Wiggin2:33 PM

I had a great idea!

Tom Wiggin
Tom Wiggin2:33 PM

we should all read the daily wtf

Tom Wiggin
Tom Wiggin2:33 PM

then maybe things like this wouldn't happen

Tom Wiggin
Tom Wiggin3:11 PM

If you read it you will know what I mean

Jack Stratton
Jack Stratton3:20 PM

okay well if you're interested in frcsim you should try to set it up, we could probably use someone who knows how to use it on the team this year

Jeff Dalton
Jeff Dalton3:25 PM

On Saturday, I was chatting with the Skunkworks mentor who wrote (part of team?) FRCSim when he was at WPI 2 years ago. He said performance has gotten better in 2 years, but he doesn't know of anyone that uses it to test software. It is mostly used just for demos. On the positive side, it is only loosely tired to Solidworks. Just an export tool. So there is nothing architecturally preventing its use with Fusion 360. He had no enthusiasm for using FRCSim productively as a tool. To me, it sounded like this was a school project for him that was more a demonstration of technology than a tool. That said, I've been mucking with it too, but it sounds like you're further along. I've got Gazebo running, but need to track down some libs for FRCSim.

Jack Stratton
Jack Stratton3:25 PM

most things come from Worcester Polytechnic Institute, see http://wpilib.screenstepslive.com/s/4485 for documentation on FIRST programs

Jack Stratton
Jack Stratton3:26 PM

it takes a lot of mental effort to read that

Jack Stratton
Jack Stratton3:27 PM

I don't know about direct Microsoft ties, but many teams and events are sponsored by Microsoft, and many volunteers are reimbursed by Microsoft.

Jack Stratton
Jack Stratton3:28 PM

I assume someone loosely related to Microsoft worked on FRCSim at some point, why?

Tom Wiggin
Tom Wiggin3:43 PM

"only loosely tired to solidworks" you can edit messages btw

Tom Wiggin
Tom Wiggin3:52 PM

What is gazebo and WPI?

Declan Freeman-Gleason
Declan Freeman-Gleason4:20 PM

WPI: https://wpi.edu
Gazebo: http://gazebosim.org

Jack Stratton
Jack Stratton4:40 PM

Our GitHub page: https://github.com/Spartronics4915/

Jack Stratton
Jack Stratton4:40 PM

@Jack Stratton pinned a message to this channel.

Jack Stratton
Jack Stratton4:40 PM

(ignore, just pinning for later)

Jack Stratton
Jack Stratton4:47 PM

we use it a lot, so get used to it :)

Jack Stratton
Jack Stratton4:47 PM

actually I should probably pin that too

Tom Wiggin
Tom Wiggin4:47 PM

don't, it's obvious

Tom Wiggin
Tom Wiggin4:48 PM

too late...

Finn Mander
Finn Mander7:11 PM

@Finn Mander has joined the channel

2016-11-07
Mike Rosen
Mike Rosen8:53 PM

I've spent perhaps four hours over the last two days on "frcsim." Here's what I've found.

There is a simulator for FRC robots. That means you can run and debug code entirely on your laptop hardware, without a robot connected. You can even hook up an Xbox controller and drive a robot around the screen.

I think it can work well enough to be useful. Here's a 4 minute youtube video of Peter Mitrano (WPI) characterizing a video-game controller and then using it to drive a simple robot: https://www.youtube.com/watch?v=SDk0TW8Xgic&t=126s. I can reproduce this demonstration on my Linux laptop.

That said, it is absolutely "not ready for primetime." It's cutting edge. Expect to bleed when you touch it. Lots of installation woes and unexpected, unexplained behavior.

But again, ... it can be done ... and if you do it, you can be writing test / exploratory programs this week, not when hardware is available (when?). This is my first year with Spartronics but I have to think that a ready test environment would be hugely valuable here. Am I right?

Does anyone want to chase this with me?

Tom Wiggin
Tom Wiggin8:56 PM

I have 7 weeks of math homework to catch up on

Tom Wiggin
Tom Wiggin8:56 PM

I will help you!

Will Hobbs
Will Hobbs9:11 PM

@Will Hobbs has joined the channel

Mike Rosen
Mike Rosen11:08 PM

@Mike Rosen has left the channel

Mike Rosen
Mike Rosen11:10 PM

@Mike Rosen has joined the channel

2016-11-09
Conrad Weiss
Conrad Weiss6:38 PM

@Conrad Weiss has joined the channel

Brian Hutchison
Brian Hutchison8:08 PM

@Brian Hutchison has joined the channel

Sophie Holzer
Sophie Holzer9:05 PM

@Sophie Holzer has joined the channel

Lia Johansen
Lia Johansen9:45 PM

Hey everyone. I have sent an email out that lists the homework. Please read the "Robot programming" and "WPILib overview" on the developers handbook. We will have a short quiz next meeting (11/16/16 3-5 pm). Also if you have not emailed me your github username, please do so soon.

Jack Stratton
Jack Stratton11:23 PM

I'm going to be about 15 minutes late to every meeting that starts at 3:00 fyi

2016-11-10
Lia Johansen
Lia Johansen7:05 AM

@Jack Stratton: thanks for letting us know

Riyadth Al-Kazily
Riyadth Al-Kazily8:51 AM

I went over the robot control system in a bit of a hurry yesterday. I recommend you all check out the FRC documentation on the control system, particularly the hardware overview: https://wpilib.screenstepslive.com/s/4485/m/24166/l/144968-2016-frc-control-system-hardware-overview

Riyadth Al-Kazily
Riyadth Al-Kazily8:52 AM

And remember, that document is part of a treasure trove of information about the overall control system and how to use it: https://wpilib.screenstepslive.com/s/4485

Samantha Rosen
Samantha Rosen12:19 PM

@Samantha Rosen has joined the channel

Marie Sachs
Marie Sachs5:35 PM

@Marie Sachs has joined the channel

2016-11-14
Lia Johansen
Lia Johansen8:12 AM

Hey everyone. Just a reminder that not all of you have emailed me your github account. If that is you, please do so soon. Thanks

2016-11-15
Lia Johansen
Lia Johansen8:23 AM

This is a reminder to read "robot programming" and "WPILib overview" from the developers handbook on github for tomorrow's meeting. We will be having a quick quiz. Thanks

2016-11-16
Niklas Pruen
Niklas Pruen10:40 AM

@Niklas Pruen has joined the channel

Dana Batali
Dana Batali2:41 PM

Here is a link to a slide-deck I created called “Intro to FRC programming”. It overlaps with the developer’s handbook, but also has links to example programs and programming exercises. It’s a work in progress and might take a few reads to fully digest. https://docs.google.com/presentation/d/1ZiMBC9y3xrwFk1akdaiVBMLLS6EyY6BSfiTRQo1KlM/edit?usp=sharing

2016-11-17
Dana Batali
Dana Batali12:29 PM

@Dana Batali pinned their GSuite Presentation "Intro to FRC Programming" to this channel.

Tom Wiggin
Tom Wiggin7:13 PM

marketing is having a heated discussion about team uniforms btw

John Sachs
John Sachs8:04 PM

@John Sachs has joined the channel

2016-11-20
Michael Nelson
Michael Nelson3:22 PM

@Lia Johansen: what's your email? I think all the emails are being marked as junk or spam.

Lia Johansen
Lia Johansen3:25 PM

@Michael Nelson:

Tom Wiggin
Tom Wiggin4:47 PM

if you need to see someone's email address, check their profile

2016-11-30
Noah Martin
Noah Martin6:52 PM

@Noah Martin has joined the channel

Tom Wiggin
Tom Wiggin9:24 PM

Sorry I couldn't join the November 30th meeting; my mother was sick and couldn't drive me to school

2016-12-01
Lia Johansen
Lia Johansen4:08 PM

Hey Everyone,

For our next meeting (12/14/16) please read slides #1-38 on the slideshow linked below. This will help you get a better understanding of FRC programming.

https://docs.google.com/presentation/d/1ZiMBC9y3xrwFk1akdaiVBMLLS6EyY6BSfiTRQo1KlM/edit#slide=id.p

2016-12-02
Tom Wiggin
Tom Wiggin5:15 PM

what does the clone command in git do?

Dana Batali
Dana Batali5:41 PM

clone is the way to bootstrap one repository given another. Typically you do this once per repository. In the slides referenced above, there’s a work-in-progress git tutorial (starting at page 65) that describes these init steps:

1. create github account
2. fork a spartronics repository into your github account
3. clone your fork onto your development machine
4. start working on the clone of the fork

As you might imagine, fork and clone are closely related.

Tom Wiggin
Tom Wiggin5:44 PM

nevermind I found the documentation

Tom Wiggin
Tom Wiggin5:44 PM

thanks

Dana Batali
Dana Batali5:45 PM

btw: programmers shouldn’t worry about git until they complete exercise 1 and 2. This channel is a good place for questions and answers. If you have any questions, fire away! And if you’ve already finished these exercises, please chime in and help other team-members. Thanks!

2016-12-08
Finn Mander
Finn Mander5:38 PM

Hope it's ok that I ask here: Does anyone have an hdmi capture card that I can use at FLL? They're typically used for recording video games.

Tom Wiggin
Tom Wiggin5:39 PM

I wish

Tom Wiggin
Tom Wiggin5:39 PM

sorry but no I don't have one

Declan Freeman-Gleason
Declan Freeman-Gleason5:44 PM

@Finn Mander: You can't use software recording like OBS?

Alex Larson Freeman
Alex Larson Freeman5:44 PM

pretty sure you need a capture card if you're using a separate device to stream

Finn Mander
Finn Mander5:45 PM

I need an interface to connect the hdmi input. Computers output hdmi but can't input it without a capture card from what i've read. Thanks for your response!

Finn Mander
Finn Mander5:45 PM

No worries, thanks Tom

Declan Freeman-Gleason
Declan Freeman-Gleason5:45 PM

Yeah, you will need a capture card then

Alex Larson Freeman
Alex Larson Freeman5:46 PM

It's worth a try, it seems there is some software you can use

Alex Larson Freeman
Alex Larson Freeman5:48 PM

ah never mind you need to do it through a network

Jon Coonan
Jon Coonan5:49 PM

The school network is not nearly fast enough FYI - we did that last year and it lagged by about 3 seconds @ 600 x 400 resolution

Declan Freeman-Gleason
Declan Freeman-Gleason5:50 PM

I have a network switch, so you could potentially connect them all through ethernet

Declan Freeman-Gleason
Declan Freeman-Gleason5:50 PM

Although that situation is less than optimal

Jon Coonan
Jon Coonan5:51 PM

There are no ethernet ports in the gym

Jon Coonan
Jon Coonan5:51 PM

And that is a long cord anyways

Alex Larson Freeman
Alex Larson Freeman5:51 PM

it would just need to be between the two computers I think

Jon Coonan
Jon Coonan5:51 PM

If I understand Finn’s setup correctly

Jon Coonan
Jon Coonan5:51 PM

It goes like this

Declan Freeman-Gleason
Declan Freeman-Gleason5:51 PM

Yeah, that's the point of the switch

Declan Freeman-Gleason
Declan Freeman-Gleason5:52 PM

Like a mini network to communicate on

Alex Larson Freeman
Alex Larson Freeman5:52 PM

either way the problem with the software is that it's closer to remote desktop and not quite what we want

Alex Larson Freeman
Alex Larson Freeman5:52 PM

There are a couple other options though

Alex Larson Freeman
Alex Larson Freeman5:54 PM

alright this one looks promising: http://spacedesk.ph/

Declan Freeman-Gleason
Declan Freeman-Gleason6:05 PM

@Finn Mander What is the actual setup you have for this like? (Sorry if you got double mentioned)

Tom Wiggin
Tom Wiggin8:07 PM

or we could just use remote login?

Tom Wiggin
Tom Wiggin8:07 PM

what do we need this for?

Tom Wiggin
Tom Wiggin8:09 PM

live stream over LAN right?

Tom Wiggin
Tom Wiggin8:09 PM

VLC has built in network streaming

Finn Mander
Finn Mander8:22 PM

It wouldn't actually be connected to the internet. The screen on the computer would be displaying the camera feed live. We would just have the projector switch between two sources: the computer with the camera, and the score computer

Declan Freeman-Gleason
Declan Freeman-Gleason8:26 PM

Can you connect the camera to the score computer?

Alex Larson Freeman
Alex Larson Freeman8:28 PM

you could probably just switch the input for the score projector

Alex Larson Freeman
Alex Larson Freeman8:28 PM

from the camera to the score computer

Declan Freeman-Gleason
Declan Freeman-Gleason8:30 PM

@Finn Mander: Switching the input might work well

Declan Freeman-Gleason
Declan Freeman-Gleason8:31 PM

I think that's a good idea

Declan Freeman-Gleason
Declan Freeman-Gleason8:31 PM

Although there is a bit of a delay when switching input

Finn Mander
Finn Mander8:31 PM

Yeah but it is an easy way to switch between sources. Besides, we don't have a video switcher available to us

Finn Mander
Finn Mander8:32 PM

It worked pretty well last year. The delay was maybe 2-3 seconds

Finn Mander
Finn Mander8:32 PM

@Alex Larson Freeman: yeah that's what I was hoping to do

Declan Freeman-Gleason
Declan Freeman-Gleason8:41 PM

@Finn Mander: If you can directly connect the camera to the scoring computer then you can use something like OBS (obsproject.com) to switch seamlessly or on a timer, and then you can fullscreen it in preview mode on the projector. That's the only other idea I have, although it might be more trouble than it's worth. We'll see soon, I suppose.

Jack Stratton
Jack Stratton8:42 PM

the original issue was that he didn't have a capture card to do that :)

Declan Freeman-Gleason
Declan Freeman-Gleason8:50 PM

Well, that's the if. I was under the impression that there were two computers, and inferred that if the camera was able to connect to one computer then it would be able to connect to the other. Honestly though, I don't really have enough information about the situation to provide very helpful advice, which is why I'm not going to provide any more of it unsolicited.

2016-12-09
Dana Batali
Dana Batali8:32 AM

The axis cameras we have for the robot can stream their output to a computer via a LAN connection (i.e. via a hub/router)... Not sure if that's of any use for Finn's setup

Finn Mander
Finn Mander9:35 AM

Thanks everyone for your help! I think I may have been overcomplicating the setup. We may be able to simply run an hdmi cable from the camera to the projector and then a hdmi cable from the scoring computer to the projector. Declan you have a great point. Using that software would make a great transition, though I'm not sure I'll be able to find access to a capture card in time to use that.
Thanks for the idea Dana. I would like to avoid network or lan connections since unfortunately the school internet had a really low bitrate.

Finn Mander
Finn Mander9:35 AM

has*

Tom Wiggin
Tom Wiggin9:51 AM

the BYOD network has an awful bitrate

Tom Wiggin
Tom Wiggin9:51 AM

the regular one doesn't

Tom Wiggin
Tom Wiggin9:51 AM

also

Tom Wiggin
Tom Wiggin9:52 AM

ask everyone to stop using the network while you are doing it

Dana Batali
Dana Batali10:57 AM

if you go straight through a router you don’t need to access the byod network at all

Dana Batali
Dana Batali10:57 AM

but best to keep it simple

Tom Wiggin
Tom Wiggin5:47 PM

what's the point of all the fancy security if you can just access it directly through the router?

Dana Batali
Dana Batali7:06 PM

Tom - one can connect multiple computers via one network without going through another network. I assume that you are referring to the BHS network security? That is justified to ensure that the internet isn’t broadly available. In our case, we don’t need access to the internet, we just need one computer to talk to another, ie be in the same network. So we don’t need access to BHS network & the internet.

2016-12-11
Tom Wiggin
Tom Wiggin8:08 PM

?

Jack Stratton
Jack Stratton8:10 PM

in the end, they just used two video cables and the button on the projector remote to switch inputs. (until we got a second projector, anyway)

2016-12-15
Dana Batali
Dana Batali8:35 AM

Hey programmers - lots of great progress last night! If you get a chance, please do try to work through the examples on your own time. Questions can be posted to this channel! Also, we'll be adding new examples to the set as time allows, so do check in on the slide-set as time permits.

2016-12-17
Michael Nelson
Michael Nelson12:39 PM

@Michael Nelson has left the channel

2016-12-18
Tom Wiggin
Tom Wiggin3:23 PM

Haven't had time to read anything because of Christmas celebrations

Tom Wiggin
Tom Wiggin3:25 PM

btw I assumed you meant you could just plug in with an ethernet cable into the router and get internet and printer access without authorization

Jack Stratton
Jack Stratton3:28 PM

ok please read things now that you do :)

Jack Stratton
Jack Stratton3:28 PM

we made a lot of progress, important for everyone to catch up

2016-12-20
Michael Nelson
Michael Nelson12:33 AM

@Michael Nelson has joined the channel

2016-12-22
Tom Wiggin
Tom Wiggin1:53 PM

what is an application stack?

Jack Stratton
Jack Stratton3:03 PM

just in general?

Tom Wiggin
Tom Wiggin3:28 PM

what would you use one for?

2016-12-23
Jack Stratton
Jack Stratton1:27 PM

it's just the list of technologies your application uses. the os, database layer, server framework, client framework... mostly called a stack in web development

Tom Wiggin
Tom Wiggin4:11 PM

still confused

Tom Wiggin
Tom Wiggin4:11 PM

whats a database layer?

Jack Stratton
Jack Stratton4:41 PM

most business programs use other programs to store and fetch data since they're designed for doing it well

Tom Wiggin
Tom Wiggin8:04 PM

server framework?

Jack Stratton
Jack Stratton8:07 PM

I know you were learning Ruby, are you familiar with Rails/Sinatra? (even the concept behind it, not necessarily Rails itself)

Jack Stratton
Jack Stratton8:08 PM

better analogy: WPILib

Tom Wiggin
Tom Wiggin8:08 PM

I gave up a quarter of the way through and did something else

Tom Wiggin
Tom Wiggin8:08 PM

we are using java right?

Jack Stratton
Jack Stratton8:08 PM

a quarter of the way through Dana's presentation?

Tom Wiggin
Tom Wiggin8:08 PM

no through the ruby guide

Tom Wiggin
Tom Wiggin8:09 PM

it was written for a way older version of ruby

Tom Wiggin
Tom Wiggin8:09 PM

oh wpilib?

Tom Wiggin
Tom Wiggin8:21 PM

there we go

2016-12-24
Jack Stratton
Jack Stratton9:31 PM

@Jack Stratton pinned a message to this channel.

2016-12-25
Tom Wiggin
Tom Wiggin3:56 PM

do we have a google classroom?

2016-12-26
James Slattery
James Slattery4:00 PM

I don't think so

Tom Wiggin
Tom Wiggin6:47 PM

k

2017-01-04
Niklas Pruen
Niklas Pruen6:13 PM

will we need a computer today or is it just the kickoff training

Clio Batali
Clio Batali6:14 PM

Nope, just kickoff!

2017-01-09
Brian Hilst
Brian Hilst9:34 PM

@Brian Hilst has joined the channel

2017-01-12
Lia Johansen
Lia Johansen3:47 PM

Hey everyone, please bring your engineering notebooks (per usual) for brainstorming tomorrow

2017-01-13
Declan Freeman-Gleason
Declan Freeman-Gleason5:12 PM

If anyone is interested in the thread I posted asking about light sensors for detecting the gaffers tape in autonomous, here it is: https://www.chiefdelphi.com/forums/showthread.php?t=153561

Lia Johansen
Lia Johansen7:12 PM

Riyadth Al-Kazily
Riyadth Al-Kazily9:07 PM

@Declan Freeman-Gleason That is a good thread, and it's already getting some interesting responses. One thing we may need to consider is how we determine which color tape we are looking for. I seem to remember that once autonomous starts, the FMS informs the robot of alliance color via the network tables. That could be a more reliable way to look for the correct color than a setting made on the driver station (where it could be accidentally set to the wrong color).

Riyadth Al-Kazily
Riyadth Al-Kazily9:09 PM

Maybe we should consider using blue and red LEDs near the sensor to help identify the tape color (that is, illuminate the area under the sensor with light the same color as the tape we are looking for -- that might increase the amount of light reflected by the tape, relative to the carpet). Plus it would look cool to shine blue or red light out from under our robot...

Declan Freeman-Gleason
Declan Freeman-Gleason9:11 PM

Yeah, I think that would look really cool, and probably work well. If we put two color sensors on the robot then we could even determine the angle of the lines.

2017-01-14
Jack Stratton
Jack Stratton9:05 PM

bring your notebook tomorrow

2017-01-15
Jack Stratton
Jack Stratton1:52 PM

git: https://git.io/vMw8y

Jack Stratton
Jack Stratton1:52 PM

@Jack Stratton pinned a message to this channel.

Declan Freeman-Gleason
Declan Freeman-Gleason2:15 PM

Here is the link to the CTRE Toolsuite which includes the javadoc and the actual library: http://www.ctr-electronics.com/control-system/hro.html#producttabstechnicalresources

Riyadth Al-Kazily
Riyadth Al-Kazily2:52 PM

Link to Screensteps WPIlib control system pages: https://wpilib.screenstepslive.com/s/4485

Dana Batali
Dana Batali3:24 PM

serial number for installing the FRC Control System (National Instruments driver station): M82X13758 (serial number came with the KOP)

Dana Batali
Dana Batali3:25 PM

(only install this if you have windows and think you need driver station on your computer)... This isn't needed for mainstream development

Declan Freeman-Gleason
Declan Freeman-Gleason3:37 PM

Here is the list of sensors we're interested in that I sent to Clio:

Distance Sensor (Different types?)
Color Sensor (Related thread: https://www.chiefdelphi.com/forums/showthread.php?t=153561, get a few kinds)
Camera that Dana Mentioned (https://www.amazon.com/Pixy-CMUcam5-Smart-Vision-Sensor/dp/B00IUYUA80)

Clio Batali
Clio Batali3:49 PM

http://www.schneider-electric.co.uk/en/faqs/FA142566/ Difference between PNP and NPN. Worth checking out!

Adrianna Carter
Adrianna Carter4:40 PM

@Adrianna Carter has joined the channel

Enrique Chee
Enrique Chee5:59 PM

Please give the list to Robert for me to purchase. Ask the captains to explain how we make orders.

Riyadth Al-Kazily
Riyadth Al-Kazily7:32 PM

I have found some information on the Lego NXT color sensor, and I am pretty sure we COULD make use of it on our robot, but it might be a bit of work. Here is a link to what I found: https://www.wayneandlayne.com/bricktronics/design-and-theory/#sensorcolor

One thing we need to understand is how it works. It is basically a light sensor (for white light, which is a mixture of red, green and blue), and includes three LEDs (red, green and blue) to illuminate the thing being sensed. To determine the color, the sensor turns on each LED in turn, and measures the amount of light reflected back. It can send back the color as three numbers representing the red, green and blue content. Note that it must be calibrated in order to work correctly.

One challenge we will have is how it works while we are moving. Since three measurements must be taken to determine the color, we would need the sensor to be over the tape long enough to take at least one reading of each color, and we actually have to be over the tape for at least double that time because we don't know when the measurement cycle starts.

The article didn't say anything about the timing of the cycle in the sensor, so I don't know if it will work. But we should be able to determine how much time we have if we estimate our robot's speed, and use that to determine how many milliseconds it takes to cross the tape on the field. Does anyone in the group know how fast our robot will go? Or how wide the tape is?
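
As a rough worked example of that estimate (the robot speed and tape width below are assumed values, not measurements):

```java
// Rough feasibility check for a color sensor that needs several sequential
// readings while the robot drives over a strip of tape. The 2 m/s speed and
// 5 cm tape width used below are assumptions, not measured values.
public class TapeTiming {
    /** Time (ms) the sensor spends over the tape at a given speed. */
    public static double crossingTimeMs(double speedMetersPerSec, double tapeWidthMeters) {
        return tapeWidthMeters / speedMetersPerSec * 1000.0;
    }

    /** Number of guaranteed-complete read cycles that fit in the crossing window. */
    public static int completeCycles(double crossingMs, double cycleMs) {
        // We don't know where in the cycle we enter the tape, so budget
        // double the cycle time for one guaranteed complete reading.
        return (int) (crossingMs / (2.0 * cycleMs));
    }

    public static void main(String[] args) {
        double crossing = crossingTimeMs(2.0, 0.05); // 2 m/s over 5 cm tape -> 25 ms
        System.out.println("crossing ms: " + crossing);
        System.out.println("cycles: " + completeCycles(crossing, 3 * 2.5)); // 3 colors x 2.5 ms each
    }
}
```

With those assumed numbers, at least one full three-color cycle fits in the crossing window, so the approach is not obviously ruled out.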

Riyadth Al-Kazily
Riyadth Al-Kazily7:38 PM

More information on the Lego color sensor. This page shows a test that determined it takes 2.5ms to make a color reading: http://www.philohome.com/colcomp/cc.htm

Riyadth Al-Kazily
Riyadth Al-Kazily7:39 PM

But that page also seems to indicate the sensor doesn't use I2C interfacing, contrary to the previous page...

Riyadth Al-Kazily
Riyadth Al-Kazily7:52 PM

This sensor looks promising, but I can't tell how far away from the tape we can put it: https://www.adafruit.com/products/1334

Riyadth Al-Kazily
Riyadth Al-Kazily7:58 PM

But it also takes longer to read the color. Looks like a minimum of 2.4ms, and up to 700ms for more accuracy.

Chris Rininger
Chris Rininger7:59 PM

I recall one of the Microsoft mentors said they have had good luck with allen bradley sensors. http://ab.rockwellautomation.com/Sensors-Switches/Color-and-Contrast-Photoelectric-Sensors

Riyadth Al-Kazily
Riyadth Al-Kazily8:09 PM

They look nice. I think this model might be nice: 45CLR-5JPC1-D8. But the only place I could see a price for it wanted $630, which is definitely outside our price range.

Chris Rininger
Chris Rininger8:42 PM

Wow! Maybe they were talking about a different kind of sensor (I think he said something like "3-beam")

Riyadth Al-Kazily
Riyadth Al-Kazily8:48 PM

The one I pointed to has 3 PNP outputs, and you program it with buttons to set an individual output high when it recognizes a specific color. So you point it at the color, push some buttons, and it learns the color. We would just have to look at a digital input for "RED" or "BLUE" colors, and when the input goes high then we are at the color target. It can even work up to 3cm away from the target, which is nice, since we don't want it too close to the ground in case it gets stuck on a loose bit of carpet or something.

Riyadth Al-Kazily
Riyadth Al-Kazily8:49 PM

I suspect it can be had cheaper, but I don't know where.

Jack Stratton
Jack Stratton9:14 PM

it's nice that the tape is either white, red, or blue, and the carpet is green. perfect match for the LEDs

Riyadth Al-Kazily
Riyadth Al-Kazily9:52 PM

That Vishay sensor is just a chip. We probably need something fancier than that, but maybe not as fancy as the expensive stuff...

2017-01-16
Jack Stratton
Jack Stratton6:52 PM

Everyone using Eclipse: in the repository is an `extra` folder. Go to `Window -> Preferences -> Java -> Code Style` in Eclipse. Match the .xml files in `extra` with the tabs labeled `Formatter` and `Clean Up` -- import them, set as active, ok, ok, ok

Now, every time you make a change, right click on the project on the left side of eclipse and go to `Source -> Format` and `Source -> Clean Up...`

2017-01-17
Binnur Alkazily
Binnur Alkazily8:57 PM

@Jack Stratton nice job!! walking through your repo setup and your git presentation (and thank you, in regards to the placement of {} :slightlysmilingface: )

Jack Stratton
Jack Stratton8:58 PM

it was more of a visual aid to a mostly spoken presentation, but if anyone wants the material as reference, hey, there it is :)

Binnur Alkazily
Binnur Alkazily9:00 PM

next is my favorite question — in the repo — I mean dashboard, can we track version number (can’t see your prior code…)

Binnur Alkazily
Binnur Alkazily9:02 PM

^^^ (clarified, I hope — Riyadth said I didn’t make sense :)

Riyadth Al-Kazily
Riyadth Al-Kazily9:24 PM

Details of the actual field, in photographs.

Binnur Alkazily
Binnur Alkazily9:27 PM

^^^ autonomous team - good resource to checkout for any ideas

Jack Stratton
Jack Stratton9:28 PM

binnur: yeah, I was thinking I'd find someone to copy over last year's buildsystem with

Jack Stratton
Jack Stratton9:28 PM

*at tomorrow's meeting

Binnur Alkazily
Binnur Alkazily9:29 PM

@Jack Stratton that is an awesome idea!!! :slightlysmilingface: please lets make sure there is also a good readme generated in the process for next year!

Declan Freeman-Gleason
Declan Freeman-Gleason9:31 PM

@Jack Stratton: if you end up doing Continuous Integration you can have that tag commits on GitHub with said version numbers

Jack Stratton
Jack Stratton9:32 PM

I finished setting up Travis yesterday, but what we're talking about is having the local buildsystem copy the current git revision to $location on the robot so that it's displayed on the driver station along with who built it and when

Jack Stratton
Jack Stratton9:33 PM

last year we had ant copy some variables to the jar manifest, and read them back. it worked on... 30% of the programmers' computers?

Jack Stratton
Jack Stratton9:34 PM

we need to either fix it or try something else
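
For reference, reading attributes back out of the jar manifest (the approach the ant build used) can be sketched like this; the attribute name `Git-Revision` is an assumption and must match whatever the build actually writes:

```java
import java.io.InputStream;
import java.util.jar.Manifest;

// Sketch of reading build-stamp attributes back out of the jar manifest.
// The attribute name (e.g. Git-Revision) is an assumption; it must match
// whatever the ant/gradle build actually wrote into META-INF/MANIFEST.MF.
public class BuildStamp {
    public static String read(String attributeName) {
        try (InputStream in = BuildStamp.class.getResourceAsStream("/META-INF/MANIFEST.MF")) {
            if (in == null) return "unknown";
            Manifest mf = new Manifest(in);
            String value = mf.getMainAttributes().getValue(attributeName);
            return value == null ? "unknown" : value;
        } catch (Exception e) {
            return "unknown";
        }
    }

    public static void main(String[] args) {
        // On the robot this string could be pushed to the driver station, e.g.
        // SmartDashboard.putString("Build", read("Git-Revision"));
        System.out.println(read("Git-Revision"));
    }
}
```

Falling back to "unknown" keeps the robot code working on machines where the manifest variables never got written, which was the failure mode described above.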

Binnur Alkazily
Binnur Alkazily9:35 PM

seemed to work whenever I was looking for it :slightlysmilingface: or, I just remember the good stuff :wink:

Declan Freeman-Gleason
Declan Freeman-Gleason9:55 PM

@Jack Stratton: I wasn't offering a solution to that, just putting a thought out there I had relating to versioning in regard to Travis.

Enrique Chee
Enrique Chee9:59 PM

Who is Travis ? Gabe ?

Enrique Chee
Enrique Chee10:00 PM

or what ?

Jack Stratton
Jack Stratton10:05 PM

program to automatically build code on github

Jack Stratton
Jack Stratton10:09 PM

just makes sure that the code on github doesn't have any broken files

Enrique Chee
Enrique Chee10:42 PM

Thanks !

2017-01-18
Timo Lahtinen
Timo Lahtinen4:55 PM

@Timo Lahtinen pinned a message to this channel.

Timo Lahtinen
Timo Lahtinen7:12 PM

@Timo Lahtinen pinned a message to this channel.

Binnur Alkazily
Binnur Alkazily8:52 PM

@Enrique Chee I got a Travis in the office — confused the heck out of me the first time I read it :slightlysmilingface:

Riyadth Al-Kazily
Riyadth Al-Kazily9:08 PM

Programming team: Awesome work today! I am very impressed how the team has kept on task, got the robot moving, and enabled testing of the launch module! We're ahead of where I thought we'd be by now, which is great. We can work on adding useful features, and making our code robust and easy to debug.

I would appreciate it if every feature team implements some interface on the smart dashboard, either to indicate the status of their module (such as running or not running, or if a jam is detected), or to allow the driver to input a new parameter (such as the speed that the launcher or intake motor is running at). This will help the mechanics test their work, and make it easier to tune the robot for accuracy.

The good news is that we can combine our efforts, and use similar code for each of the features. I recommend everyone do a little research on using the smart dashboard for input and output with their commands, and we can brainstorm the best ways to do it for all the modules.

Riyadth Al-Kazily
Riyadth Al-Kazily9:08 PM

I'm already looking forward to next time!

Binnur Alkazily
Binnur Alkazily9:13 PM

I’ll give a shout out for the SmartDashboard Test Mode — it seems somewhat limited (doesn’t seem to have support for talon srx motors), BUT if we can get it done right, it is a great way to verify electronics work without wondering if your code is the problem... Here is the link:
http://wpilib.screenstepslive.com/s/4485/m/26401/c/92707

Binnur Alkazily
Binnur Alkazily9:14 PM

And, based on last year’s experiment, I believe we are sticking to SmartDashboard (not the SFX v2 version). @Dana Batali @Lia Johansen @Timo Lahtinen please validate.

Jack Stratton
Jack Stratton9:19 PM

One thing making us consider SFX is having a big green/red square to indicate whether the intake was on, as our current version is a toggle button on a joystick. (Though depending on drivers, that might be changed)

Jack Stratton
Jack Stratton9:19 PM

graphics like that are apparently easier

Binnur Alkazily
Binnur Alkazily9:25 PM

K - if that is the case, I recommend getting familiar with it sooner rather than later — set up a user workflow that everyone will follow, so that the final driver station layout gets built up as we actively develop.

2017-01-19
Dana Batali
Dana Batali9:23 AM

last year, we were pushed toward second-gen smart dashboard, because first-gen didn't support two camera feeds. Additionally there were widgets in the second gen that were sexier (graphs, etc). That said, second-gen caused more problems than it solved. If we can use 1st gen, we should stick with that. If that's not sufficient, we should look at rolling our own, probably via a web/javascript interface. I have links to other teams githubs that have followed this path.

Dana Batali
Dana Batali11:02 AM

http://www.ctr-electronics.com/Talon%20SRX%20Software%20Reference%20Manual.pdf

section 12.4: Velocity Closed-Loop Walkthrough – Java

Dana Batali
Dana Batali12:10 PM

On the topic of CANTalon in speed/velocity control mode... The question all programmers need to consider: what are the units we pass via motor.set() when in this mode? This answer depends entirely on the combination of the CANTalon library conventions AND the quad-encoder's CPM value (this is discussed in the electronic-pneumatics channel). Here's the pivotal table from the CANTalon documentation (table 17.2.2) in the manual above
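
As a concrete sketch of the units question, the usual conversion from RPM to the Talon's native velocity units (encoder edges per 100 ms, with 4x quadrature decoding) looks like this; verify the constants against table 17.2.2 and our encoder's actual counts-per-rev before trusting the numbers:

```java
// Sketch of the RPM -> native-units conversion the question above is about.
// The 4x quadrature decoding factor and the per-100ms velocity window follow
// the CANTalon documentation; codesPerRev is whatever our quad encoder
// actually provides, so treat these numbers as assumptions to verify.
public class TalonUnits {
    /** Native velocity units are encoder edges per 100 ms. */
    public static double rpmToNative(double rpm, int codesPerRev) {
        double edgesPerRev = 4.0 * codesPerRev;  // quadrature: 4 edges per code
        double revsPer100ms = rpm / 600.0;       // 60,000 ms per minute / 100 ms
        return edgesPerRev * revsPer100ms;
    }

    public static void main(String[] args) {
        // e.g. 300 RPM with a 250-code encoder -> 500 native units per 100 ms
        System.out.println(rpmToNative(300, 250));
    }
}
```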

Dana Batali
Dana Batali12:13 PM

API requirements and Native units

Dana Batali
Dana Batali12:17 PM

Dana Batali
Dana Batali12:26 PM

Another topic: Riyadth & Binnur suggested that we start providing information to the smart dashboard. For the first exercise, I suggest that all Subsystems populate the smart dashboard with the result of the initialization (success or failure). This should be done in your subsystem's constructor and is as simple as this (pseudo code follows):

try
{
    // ... initialization stuff ...
    m_initialized = true;
}
catch (Exception e)
{
    m_initialized = false;
}

SmartDashboard.putString("Drivetrain Subsystem",
    m_initialized ? "initialized" : "disabled");

Dana Batali
Dana Batali12:27 PM

Note: this will not appear on the smart dashboard until the named field (here: 'Drivetrain Subsystem') has been manually added to the smartdashboard on the driver station.

Tom Wiggin
Tom Wiggin9:25 PM

I'm really sorry I missed all the meetings and I feel awful

Tom Wiggin
Tom Wiggin9:26 PM

both my computers and my phone were confiscated for not doing homework and I was absolutely swamped

Tom Wiggin
Tom Wiggin9:28 PM

I missed the eclipse setup and the git tutorial according to Jack

Tom Wiggin
Tom Wiggin9:30 PM

I calculated that if you converted all the google classroom assignments I have missing to paper, you would have a tower 20 feet high, or enough to fill a small closet

2017-01-20
Riyadth Al-Kazily
Riyadth Al-Kazily12:53 PM

Programmers, please join the Slack channels that are related to the subsystem you are working on (if you're not already there...). Those channels are a great place to discuss features and capabilities with the mechanical and electrical teams. I see #intake #launcher and #agitator are ready to go. (And I think #agitator may be related to #launcher, so join them both!)

Riyadth Al-Kazily
Riyadth Al-Kazily12:53 PM

Do any of you on the drivetrain think we need a Slack channel for that? If so, you should ask Clio or one of the other leaders to set it up.

Declan Freeman-Gleason
Declan Freeman-Gleason3:03 PM

@Riyadth Al-Kazily: I don't think we need a channel for that

Dana Batali
Dana Batali3:06 PM

Tom Wiggin
Tom Wiggin5:10 PM

Today I learned that when a piece of software is overloaded and spends the majority of its time switching between threads, it is "thrashing"

Tom Wiggin
Tom Wiggin5:10 PM

neato

Tom Wiggin
Tom Wiggin5:27 PM

git

Tom Wiggin
Tom Wiggin5:27 PM

made by a bunch of gits for gits

Tom Wiggin
Tom Wiggin5:27 PM

:nerd_face:

Timo Lahtinen
Timo Lahtinen5:47 PM

@Timo Lahtinen pinned a message to this channel.

Tom Wiggin
Tom Wiggin5:55 PM

:rube:

2017-01-21
Dana Batali
Dana Batali10:17 AM

I wonder if it makes sense to migrate the GitHub traffic to another channel, say programming_git, so all the checkins don't get in the way of human conversations.... @Jack Stratton , what say you?

Jack Stratton
Jack Stratton10:51 AM

that integration is only set up for the 2016 repo, but it's easily (re)moved as long as people think it's useful enough to have a team captain create a new channel

Jack Stratton
Jack Stratton10:51 AM

(in my opinion the more valuable traffic is "travis build failed for pull request x", which can also be done but not by myself anymore)

Dana Batali
Dana Batali2:22 PM

programmers: if you are interested in seeing checkins and build status messages go by, wander over to the newly created programming_git channel and join in.

Dana Batali
Dana Batali2:29 PM

Regarding proper configuration of your build environments on windows, I found that I needed to set some environment variables. On windows you do this via the System control panel -> Advanced system settings -> Environment Variables.

make sure that JAVA_HOME is present and points to your jdk install. It might look something like this: C:\Program Files\Java\jdk1.8.0_91

make sure that PATH has an entry that points to Git. On my machine that looks like this: C:\Program Files\Git\bin

Dana Batali
Dana Batali2:30 PM

the second setting helps the newly introduced build stamp be more descriptive.

Dana Batali
Dana Batali2:51 PM

To programmers of Commands and CommandGroups. Here are some tips to consider when designing your commands:

- it is recommended that all commands that operate on only a single subsystem follow the naming convention of starting with the subsystem name. Thus: all Intake commands should start with Intake, all Drivetrain commands with Drive, etc.

- don't rely too heavily on last year's code for structural examples. It is generally too promiscuous (ie: it doesn't keep private things private).

- consider adding methods to your subsystem that will be shared across multiple commands. Don't break the encapsulation by allowing commands to directly manipulate the motors, but rather make abstractions. For example, the Drivetrain can have a method called driveStraight, parameterized by some notion of power or speed. Today's version of the DriveTrain and the DriveTicksCommand are good references.

- make sure to implement and think through your command's "isFinished" method. Adding debug-level logging to your command should help you to ensure that the lifetime of your command matches your expectations.

- to help with your intuitions, I recommend that you order the methods of your command according to this lifecycle (initialize, execute, isFinished, interrupted, end)

- make sure all your subsystem's methods check for initialized(). If this is done, then your commands should never need to make this check.

- if you have state-change commands, you should consider parameterizing a single command rather than implementing the same logic multiple times... For example, right now, the difference between IntakeOn, IntakeOff and IntakeReverse doesn't seem to justify three different commands and three files. Please refer to the exampleRobot: https://github.com/Spartronics4915/exampleRobot/blob/master/src/org/usfirst/frc/team4915/robot/commands/LifterAutoCtlCmd.java

If any of these items seem mysterious to you, any of the programming mentors or leaders should be able to help you with these subtle topics.
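
The "keep motors private" and "parameterize one command" tips can be sketched together. This is a minimal illustration written without the WPILib base classes so it stands alone; on the robot the command would extend edu.wpi.first.wpilibj.command.Command, and the class and method names here are hypothetical, not from our repo:

```java
// Minimal sketch of the "one parameterized command instead of three" idea,
// written without the WPILib base classes so it is self-contained. All names
// here are hypothetical illustrations, not code from the team repository.
public class IntakeSketch {
    /** Subsystem keeps the motor private and exposes an abstraction. */
    static class Intake {
        private double motorOutput = 0.0;   // stands in for a private CANTalon
        private boolean initialized = true;

        public void setSpeed(double speed) {
            if (!initialized) return;       // subsystem methods check initialized()
            motorOutput = speed;
        }
        public double getSpeed() { return motorOutput; }
    }

    /** One command parameterized by speed replaces IntakeOn/IntakeOff/IntakeReverse. */
    static class IntakeSetSpeedCommand {
        private final Intake intake;
        private final double speed;
        IntakeSetSpeedCommand(Intake intake, double speed) {
            this.intake = intake;
            this.speed = speed;
        }
        public void initialize() { intake.setSpeed(speed); }
        public boolean isFinished() { return true; }  // one-shot state change
    }

    public static void main(String[] args) {
        Intake intake = new Intake();
        new IntakeSetSpeedCommand(intake, 1.0).initialize();   // plays the role of "IntakeOn"
        System.out.println(intake.getSpeed());
        new IntakeSetSpeedCommand(intake, -1.0).initialize();  // plays the role of "IntakeReverse"
        System.out.println(intake.getSpeed());
    }
}
```

Because the command only calls the subsystem's abstraction, changing motor wiring or controllers later touches one class instead of three command files.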

Tom Wiggin
Tom Wiggin6:01 PM

wheres our simulator at?

Dana Batali
Dana Batali7:26 PM

we don’t have one

2017-01-22
Clio Batali
Clio Batali11:45 AM

Here’s a running catalogue of all of the motor/encoder assignments on the robot (subject to change): https://docs.google.com/document/d/1UOBnhfhSoBlfsNDcr38CnqWt0hnyTbl4eEsBq7iXnIw/edit?usp=sharing

Riyadth Al-Kazily
Riyadth Al-Kazily11:46 AM

Thanks Clio! Do you happen to know the gear ratio for the drivetrain gearboxes? That would be great to include on that sheet as well.

Riyadth Al-Kazily
Riyadth Al-Kazily11:47 AM

And if I recall correctly, the drivetrain encoders are connected to motor controllers 3 and 4, correct?

Clio Batali
Clio Batali11:47 AM

Off the top of my head, no, but that's written down on the robot at the moment (easy to transfer over today)

Clio Batali
Clio Batali11:48 AM

Yes for the encoders

Binnur Alkazily
Binnur Alkazily5:55 PM

@Binnur Alkazily pinned a message to this channel.

Jack Stratton
Jack Stratton6:19 PM

Right now, the appropriate NetworkTables values are being sent to the dashboard, but the dashboard mysteriously no longer displays the actual radio button chooser.

Using SmartDashboard SFX, the raw options can be seen in an ArrayView(?), but I can't figure out how to show them on the normal dashboard.

Potential lead: SendableChooser reports its display type as "String Chooser"; we may have to do that explicitly in LoggerChooser even though it extends that class and should report the same.

Jack Stratton
Jack Stratton6:34 PM

I might remove LoggerChooser again and put each logger's name in the dropdown menu choices, I only put LoggerChooser in since the 'real' SendableChooser isn't a NamedSendable for whatever reason, so you can't give it a label automatically.

Binnur Alkazily
Binnur Alkazily6:46 PM

regarding sfx — I recall it being tricky — something about needing to edit the properties of the given module using right click options in edit mode. basically add it to the sfx first, and then configure its options. may/may not help

Binnur Alkazily
Binnur Alkazily6:47 PM

remind me about the output you were seeing on the dashboard: basically the debug levels were showing up correctly, but not the label associated w/ the subsystem, such as intake. correct?

Jack Stratton
Jack Stratton6:48 PM

that's how it started - later on, the buttons disappeared entirely in smartdashboard 1.0 but you could see them in sfx

Jack Stratton
Jack Stratton6:49 PM

if I had a robot I'd explore this over the long week :)

Binnur Alkazily
Binnur Alkazily6:53 PM

:slightlysmilingface: the issue you saw with buttons disappearing could be a caching issue between network tables and smartdashboard — it requires resetting the roboRIO and driver station at the same time, if I recall

Binnur Alkazily
Binnur Alkazily6:54 PM

I am trying to remember our test from couple years back when we were playing w/ smartdashboard — I think we (and I believe I was working with you) concluded that adding a string did not work correctly. may relate to your comment on SendableChooser and String issue

Binnur Alkazily
Binnur Alkazily6:55 PM

wonder if you can manipulate the network table directly - from 2016 "settings are persisted on the roboRio in /home/lvuser/networktables.ini"

Binnur Alkazily
Binnur Alkazily6:55 PM

this could be a quick/dirty test on IF what you are thinking will render correctly

Jack Stratton
Jack Stratton6:55 PM

mm, yes

Binnur Alkazily
Binnur Alkazily6:56 PM

(regarding access to the robot — depending on your availability during the week, you could ping the coach for access, unless you have finals too :)

Jack Stratton
Jack Stratton6:57 PM

my finals aren't until march

Jack Stratton
Jack Stratton6:57 PM

@coachchee, could I come in tuesday, thursday, or friday before 11 am?

Binnur Alkazily
Binnur Alkazily7:06 PM

I am not seeing anything out of the ordinary with the code — so, break the problem down into pieces:
1) verify what you intend can display by seeing if you can manipulate the network table directly;
2) see if your code is populating network tables — you may have already tried this: http://wpilib.screenstepslive.com/s/4485/m/26401/l/255424-verifying-smartdashboard-is-working
3) see if putData does what it is supposed to (given our past surprise that not everything rendered correctly) —> just hard-wire a menu independent of the subsystems
And, remember SmartDashboard and network tables can get wonky — so, reset both to clear the cache

Binnur Alkazily
Binnur Alkazily7:09 PM

also — make sure any initialization that is needed is done correctly, so SendableChooser is going to the right SmartDashboard instance. It is possible this is going nowhere — I haven’t used this outside of Robot.java before.
hmm… I am not seeing any SendableChooser instantiation — this could be the issue

Jack Stratton
Jack Stratton7:14 PM

:+1:

Enrique Chee
Enrique Chee7:20 PM

Jack, What time ?

Jack Stratton
Jack Stratton7:21 PM

it depends on what your finals schedule looks like, but I need to be leaving at or before 11. 9:30ish work on tuesday?

Enrique Chee
Enrique Chee7:23 PM

I am assuming you need access to robot ? How about 8:30 am Tues, before I start teaching ?

Jack Stratton
Jack Stratton7:23 PM

ok

Binnur Alkazily
Binnur Alkazily7:27 PM

@Niklas Pruen and @Declan Freeman-Gleason — good simple example of Position mode in ChiefDelphi — https://www.chiefdelphi.com/forums/showthread.php?t=153753

Binnur Alkazily
Binnur Alkazily7:33 PM

See Talon SRX Programming guide on: 16.9. Why are there multiple ways to get the same sensor data?

2017-01-24
Jack Stratton
Jack Stratton9:30 AM

Not a fan of the fact that these are all sorted differently

Jack Stratton
Jack Stratton9:30 AM

anyway, thanks binnur! restarting the driver station and the robot at the same time cleared the values out, I wish there was a "empty NetworkTables" button on the driver station console though.

Binnur Alkazily
Binnur Alkazily9:32 AM

You may be able to work around it by deleting the cached files from both the driver station and the roboRIO -- it needs to be done in both places.

Binnur Alkazily
Binnur Alkazily9:32 AM

:slightlysmilingface: welcome to software development -- it NEVER ends!

Jack Stratton
Jack Stratton9:32 AM

I couldn't actually find the files anywhere https://gist.github.com/phroa/722b4a36e37501250c9a10c12f280690

Binnur Alkazily
Binnur Alkazily9:32 AM

Awesome job!

Binnur Alkazily
Binnur Alkazily9:33 AM

So... IF you run through the quick test with those values set, and look at the driver station console logs, do they filter out correctly?

Jack Stratton
Jack Stratton9:33 AM

that's step two :)

Binnur Alkazily
Binnur Alkazily9:34 AM

have you looked at chief delphi? I'll do a search this evening

Jack Stratton
Jack Stratton9:34 AM

briefly

Binnur Alkazily
Binnur Alkazily9:34 AM

I am waiting on your cut/paste of the console output :slightlysmilingface:

Binnur Alkazily
Binnur Alkazily9:34 AM

nice progress :slightlysmilingface:

Jack Stratton
Jack Stratton9:35 AM

I'll have it in a sec, need to actually generate some log messages

Binnur Alkazily
Binnur Alkazily9:36 AM

perfect!

Jack Stratton
Jack Stratton9:47 AM

oh whoops I had the console scrolled up for that screenshot, let me go down to the current log

Jack Stratton
Jack Stratton9:49 AM

sending a screenshot from the driver station sucks

Binnur Alkazily
Binnur Alkazily9:49 AM

WOOT! Looks great!!! And, I see what you mean about the sorting order -- I had to rethink what comes first. @Dana Batali @Lia Johansen @Jack Stratton I suggest we keep our levels short and to the point --> lets remove the 'notice'

Binnur Alkazily
Binnur Alkazily9:50 AM

you mean you are not using your camera to take a screenshot and then slacking? :slightlysmilingface:

Jack Stratton
Jack Stratton9:50 AM

I actually took a look at the smartdashboard code, it sorts the entries in the button list by their hashcode... so we basically can't control ordering

Binnur Alkazily
Binnur Alkazily9:52 AM

interesting! Your code looks great! Lets ship it :slightlysmilingface:

Binnur Alkazily
Binnur Alkazily9:52 AM

other than reboot -- was there any other changes you needed to make?

Jack Stratton
Jack Stratton9:53 AM

yeah, I took out that LoggerChooser thing. (I thought it was the issue at first, but it probably wasn't. I just kept the code simple (no loggerchooser) once it fixed itself with a reboot)

Jack Stratton
Jack Stratton9:54 AM

I'll push an updated version in a second

Jack Stratton
Jack Stratton9:55 AM

we do have a bit of a race condition: loggers initialized before the new `initLoggers` method in OI won't be filtered according to the buttons on the window, but I can't initialize the loggers after getting the user input because `initLoggers` relies on the loggers being initialized to know which loggers to filter

Dana Batali
Dana Batali9:55 AM

the more we battle with presentation for the smart dashboard, the easier it is for us to justify pynetworktables2js.... Coupled with a widget set like dojo or jqWidgets, we would actually save time over the kinds of battles Jack is currently fighting (and that we fought last year)...

Dana Batali
Dana Batali9:56 AM

I'll put together a trivial proof of concept and share it at Friday's meeting

Jack Stratton
Jack Stratton9:57 AM

additionally, you can't change the filters without (at a minimum) using the "restart robot code" button on the station since I don't have a good idea of when to poll the sendablechooser for the current values. it seems like you can't register an event listener or anything that simple to run when a new option is selected

Binnur Alkazily
Binnur Alkazily10:06 AM

Unfortunately my day of meetings started -- I'll catch up later. @Jack Stratton ping me directly if you need to grab my attention. And, again , great job!! :)

Chris Rininger
Chris Rininger10:32 AM

Thinking about the climber, I think it's possible there will be three motor states: off, slow (for catching & initially spooling the rope), and fast (for climbing). I'm not sure if that's enough to start working on anything, but just putting it out there.

Jack Stratton
Jack Stratton11:25 AM

@Chris Rininger Oh, so we are doing a climber? great!

Jack Stratton
Jack Stratton11:41 AM

intake team: since we're basically done, want to handle the climber? we can probably copy the intake code and replace Reverse with a half-speed mode. running the motor in PercentVbus at 1.0 and something like 0.3 for the slower speed should be fine, as 1.0 (100% Vbus) ensures it's drawing as much power as it can take (right?)

Riyadth Al-Kazily
Riyadth Al-Kazily12:01 PM

Yes, 1.0 (100%) is letting the motor draw as much as it wants, which is definitely what you want for the fast climbing rate. For the slower "spooling" rate, we will have to adjust based on the mechanism design and reliability.

Chris Rininger
Chris Rininger12:02 PM

@Jack Stratton Not my call whether or not there will be a climber - that's for the captains to decide. I do think the team will make a run at it, and I've been trying to prompt some crowdsourced analysis and brainstorming over on the climber channel. I shared the thought about states here in case you all think it makes sense to do anything now in anticipation of there being a climber.

Chris Rininger
Chris Rininger12:16 PM

One other thought on control: since the states fall in a repeating sequence (off to slow, slow to fast, and fast to off), the interface could be a single button I suppose. Not sure what's best, but I'm assuming less is more given the number of controls that may be on the station.
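That repeating off → slow → fast → off sequence is just a three-state cycle, so the single-button idea is small to express. A sketch in plain Python (the speed values are placeholders in the spirit of the numbers mentioned above, not tuned outputs):

```python
# Single-button climber control: each press advances off -> slow -> fast -> off.
STATES = ["off", "slow", "fast"]
SPEEDS = {"off": 0.0, "slow": 0.3, "fast": 1.0}  # placeholder PercentVbus values


class Climber:
    def __init__(self):
        self.index = 0  # start in the "off" state

    @property
    def state(self):
        return STATES[self.index]

    def on_button_press(self):
        """Advance to the next state in the repeating sequence; return the new speed."""
        self.index = (self.index + 1) % len(STATES)
        return SPEEDS[self.state]
```

One design note: because the sequence wraps, a driver can always get back to "off" by pressing again, which is the main safety argument for a cycle over three separate buttons.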

Riyadth Al-Kazily
Riyadth Al-Kazily1:04 PM

One big question might be if we can detect when to stop automatically, or if it has to be under driver control. If we can't detect based on a sensor, then we need to make sure the mechanism won't break itself if not used correctly by the driver.

2017-01-26
Jack Stratton
Jack Stratton1:59 PM

regarding https://github.com/Spartronics4915/2017-STEAMworks/pull/17 - we still need to figure out why this doesn't work some of the time. I briefly looked at one log and it said something about exit code 127 being the reason it couldn't get the version string

Dana Batali
Dana Batali2:07 PM

Two data points:

1. the user must have git in their path... On windows this requires one to do use the system control panel, etc. (and depends on how git was installed on the system)
2. there needed to be at least one tag available... I was getting this error when my branch had no tags.

Question for you: did the week2 tag automatically appear, or did you need to explicitly pull it?

Jack Stratton
Jack Stratton2:08 PM

so, I actually sent that PR from my other computer, so I had to run a fresh clone to get the repo. I think that pulled the tag with it.

Dana Batali
Dana Batali2:08 PM

that's good news... Probably a good idea to add a new tag, say week3, to see what effect that has

Jack Stratton
Jack Stratton2:09 PM

let's try that tomorrow

Jack Stratton
Jack Stratton2:09 PM

for what it's worth, week2 is the only tag I have. there's no week1

Dana Batali
Dana Batali2:46 PM

right... week2 is the only tag that's been added, I believe

Dana Batali
Dana Batali3:24 PM

hmm... @Jack Stratton: I'm seeing different behavior on my windows git install. Specifically, unless I do:

git pull upstream master --tags

I don't see the new tags.

I just tested this by creating a new tag from the POV of github, then I did a git pull upstream master (no --tags),

I used git describe --tags to determine state

Jack Stratton
Jack Stratton3:26 PM

'git clone' does pull the tags, 'git pull' on an existing repo doesn't I believe

Jack Stratton
Jack Stratton3:26 PM

I had to use clone since this is a different computer

Dana Batali
Dana Batali3:29 PM

more on this from "git help pull":
By default, tags that point at objects that are downloaded from the
remote repository are fetched and stored locally. This option disables
this automatic tag following. The default behavior for a remote may be
specified with the remote.<name>.tagOpt setting. See git-config(1).

Dana Batali
Dana Batali3:31 PM

(this was from the --no-tags section)

Jack Stratton
Jack Stratton3:32 PM

what happens if you 'git pull upstream' without master? perhaps it's not pulling new tags since master is the only ref it was told to pull

Dana Batali
Dana Batali3:33 PM

since I've now polluted my local repo, I'm not sure I can answer this

Jack Stratton
Jack Stratton3:33 PM

might have to experiment with someone else's tomorrow then

Riyadth Al-Kazily
Riyadth Al-Kazily7:50 PM

Pulling should always pull tags (without any options). However, pushing does not push tags, unless specifically told to do so (git push --tags)

Jack Stratton
Jack Stratton7:50 PM

hmm

2017-01-27
Clio Batali
Clio Batali7:45 PM

http://www.vexrobotics.com/217-5049.html @Jeremy Lipschutz @Brian Hutchison @Ronan Bennett

Riyadth Al-Kazily
Riyadth Al-Kazily10:42 PM

Both the magnetic (217-5049) and optical (amt103) encoders are fast enough for our applications. The magnetic encoder can do absolute positioning to 6600 RPM, and quadrature to 15000, so it could be used to read the output of a CIM directly (and easily handle the agitator). The optical encoder can handle either 7500 or 15000 RPM, depending on the model number. Again, fast enough for direct mounting to a CIM.

2017-01-29
Binnur Alkazily
Binnur Alkazily10:47 AM

team, good overview of PID and Talon SRX. @Declan Freeman-Gleason @Niklas Pruen, see slide #20 on best practices for reversing sensor direction (using reverseSensor() instead of negating sign)
https://docs.google.com/presentation/d/1D8RkpKMOcsGaR1Tjba90VsQoI3Abe81k5sg9Yg9H2o/preview?slide=id.g138e6bb2e62195

2017-01-30
Dana Batali
Dana Batali1:13 PM

team: I took a crack at an even-simpler overview of motor controllers here. If you don't understand the stuff in the above presentation, please peruse this:
https://docs.google.com/presentation/d/1L8-OFV8CBPUkS134OtNrn8gqD0vTJ9to8vfntubdxN0/edit?usp=sharing

Dana Batali
Dana Batali1:13 PM

Dana Batali
Dana Batali1:26 PM

btw: in the slides Binnur referenced, there are no calls in execute... This implies to me that robot safety has been disabled.

Dana Batali
Dana Batali1:27 PM

If you don't know what that means, please refer to my slides on motor safety.

2017-02-01
Tom Wiggin
Tom Wiggin7:35 PM

according to clio the entire electronics board is going vertical

Tom Wiggin
Tom Wiggin7:36 PM

the entire thing

Tom Wiggin
Tom Wiggin7:36 PM

it's insane

2017-02-02
Dana Batali
Dana Batali9:31 AM

programmers of position closed-loop modes: I found another example of closed-loop control settings from the CTRE people here: http://www.ctr-electronics.com/downloads/pdf/Talon%20SRX%20Software%20Reference%20Manual.pdf, page 83... The example is not FRC code, but rather their C#/HERO platform. The good news is that it's quite readable and has comments explaining many of the magic configuration settings. One mystery I can't explain: when they EnableClosedLoop, they SetVoltageRampRate(0). This may imply that they wish to proceed with no acceleration ramp. They do configure the PeakOutputVoltage to +3,-3, but since we're not talking FRC robots, that might not mean the same thing.

The example does wait 100ms after each call to SetPosition(0);

Brian Hilst
Brian Hilst2:50 PM

@Dana Batali Thanks! Will take a look at it.

Brian Hilst
Brian Hilst3:12 PM

@Dana Batali I was re-reading your slides and have a couple follow-up questions:
1. On slide #17 it mentions setting closed loop peak outputs in the range of -1023, 1023. What method is this referring to? The only methods with “Peak” I see are for voltage.
2. Slide 18 suggests just setting the F gain. Should PID all be set to zero then?

Dana Batali
Dana Batali3:16 PM

#2: my reading is that F is primarily useful in velocity control mode. There, one starts with PID=0, F>0... For position mode, I believe one starts with P > 0 and IDF=0.

Dana Batali
Dana Batali3:18 PM

#1: these numbers represent the entire range of "throttle units"... I'm not entirely clear on the connection between throttle and voltage. I might guess that configPeakOutputVoltage may be what we're talking about here.

Binnur Alkazily
Binnur Alkazily5:02 PM

@Brian Hilst: I have shown the calculation between throttle units to rates in my sample code - see the constructor comments for some calculations

Brian Hilst
Brian Hilst5:02 PM

@Binnur Alkazily Ok. We’re testing your code now. On the first try it ran the motors continuously in opposite directions. Putting in some logging now.

Brian Hilst
Brian Hilst5:03 PM

Moving is better than not!

Binnur Alkazily
Binnur Alkazily5:03 PM

On PID - start simple is always the recommendation - that translates to setting the P value first, then determining if the others are needed (usually by observation, e.g. is there oscillation occurring that needs correction)
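To make the start-with-P-only advice concrete, here is a toy proportional-only loop (pure Python; the plant model and gains are illustrative, not robot values). With a modest gain the error shrinks each step; crank the gain too high and the same loop oscillates, which is the observation that tells you a D term is needed:

```python
def p_control(setpoint, position, kp):
    """Proportional-only controller: output scales with the remaining error."""
    return kp * (setpoint - position)


def simulate(setpoint, kp, steps):
    """Toy plant: position moves by exactly the controller output each step."""
    position = 0.0
    for _ in range(steps):
        position += p_control(setpoint, position, kp)
    return position
# With kp = 0.5 the error halves every step (smooth convergence);
# with kp = 2.0 the error flips sign every step (sustained oscillation).
```

The same observe-then-add-terms loop applies on the real robot, just with a physical plant instead of this one-line model.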

Binnur Alkazily
Binnur Alkazily5:04 PM

Opposite direction means the reverseSensor and reverse motors are incorrect in code - I have a comment on that line

Binnur Alkazily
Binnur Alkazily5:06 PM

(Typing on iPhone is hard - sorry for typos...)

2017-02-03
Dana Batali
Dana Batali9:45 AM

note the negation of values sent to the right motors

Dana Batali
Dana Batali11:10 AM

Autonomous team: to summarize the plan of attack we all discussed last night.

@Binnur Alkazily mentioned that there were 4 options... I believe option #2 was dismissed leaving these three approaches:

1. we continue with our auto-drive-straight approach, controlling two separate motors with two separate control modes. @Brian Hilst @Niklas Pruen might continue to tune their code and integrate it with the recent drivetrain changes.

2. we implement a software PID (atop DriveDistancePIDCmd) that tries to ensure straight driving while achieving a repeatable distance (to within an inch).
2a. to modify the DriveDistancePIDCmd to sample the imu and adjust the rotation to stay on the initial imu heading. (I finally dug up last year's reference for this: 2016-Stronghold/src/..../commands/Drivetrain/DriveStraightCommand.java.)
2b. described by @Riyadth Al-Kazily: to set one motor running the hardware PID/Position and to install a software PID that tries to track the primary motor's encoder position,
My current sense is that fixing the DriveDistancePIDCmd is trivial, so we should implement that quickly and validate it first thing Sunday. If we find it lacks sufficient accuracy, then we should proceed to 2b. @Declan Freeman-Gleason & @Niklas Pruen might pursue this.

3. to implement a driver-recording/playback mode. There we'd need to sample the series of calls to arcadeDrive with the associated timestamp, then see if we can replay it. @Jack Stratton was going to pursue this plan.

Regarding building toward some actual autonomous programs:

0. a small amount of work remains in order to deliver a drive-to-line command. We need clarification from the rule-masters on whether we need to simply break the line with the bumper or to entirely cross the line with the robot. Additionally, this command probably needs to behave differently according to initial position.

1. LT has requested that we put shooting as our highest priority. If we can achieve sufficient accuracy with our driveStraight and our autoTurn commands, then we need to string a few of these into a CommandGroup and measure the accuracy of this "dead reckoning" approach. We discussed the need for two implementations - for the red and blue alliance configurations. If performance-capture proves successful, we simply need to record a few performances from a variety of starting locations and field configurations. Obviously we need to include launcher/agitator control here, so we'll need collaboration from @Jeremy Lipschutz, @Ronan Bennett, @Brian Hutchison .

2. next on the list is the center-position sprocket-delivery. In theory, if we can accurately drive straight, we simply need to measure the precise distance from center-position to the sprocket-delivery location. We need to ensure that the delivery is both accurate and timely: that is we must allow time for the pilot to pull and deposit the sprocket into place. If we can't achieve reliable accuracy with dead-reckoning or performance-capture, we could try to move forward with PixyCam vision.

3. last on the list is the side position sprocket-delivery. Again, if dead-reckoning or performance-capture is sufficiently accurate, it's only a matter of measurement plus managing the variations. And again, if dead-reckoning or performance-capture isn't sufficient we'd need to investigate augmenting it with vision.

Please chime in with additional comments, observations, corrections etc!
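For option 2a above, the core of the heading-hold correction is small: capture a heading when the command starts, then each cycle feed the heading error back into the rotation term of the drive call. A sketch in plain Python (`drive_straight_step`, the gain, and the output range are illustrative stand-ins, not our actual drivetrain API):

```python
def drive_straight_step(target_heading, current_heading, forward_speed, kp=0.03):
    """Return a (speed, rotation) pair for an arcade-style drive call.

    Rotation is proportional to the heading error in degrees, steering the
    robot back toward the heading captured when the command started.  The
    output is clamped to the usual [-1, 1] motor range.
    """
    error = target_heading - current_heading
    rotation = max(-1.0, min(1.0, kp * error))
    return forward_speed, rotation
```

In the real command this would run every scheduler tick, with `current_heading` sampled from the imu.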

Brian Hilst
Brian Hilst6:12 PM

One addition that was discussed is using one or two switches on the bumper of the robot to detect contact with the boiler during autonomous. However, it might be sufficient to just program for a little extra distance and watch for the measured velocity to drop to zero.

2017-02-04
Dana Batali
Dana Batali2:55 PM

here are the details about IP address, mDNS and bandwidth limitations during competition...

http://wpilib.screenstepslive.com/s/4485/m/24193/l/291972-fms-whitepaper

2017-02-05
Brian Hilst
Brian Hilst10:30 AM

Niklas and I will be coming after church later this morning.

Binnur Alkazily
Binnur Alkazily11:03 AM

PID control rules of thumb -- And, here is a good resource on tuning PIDs https://youtu.be/UOuRx9Ujsog

Jack Stratton
Jack Stratton1:33 PM


private static final double turnKp = 0.12;
private static final double turnKi = 0;
private static final double turnKd = 0.30;
private static final double turnKf = 0.001;

Jack Stratton
Jack Stratton1:34 PM

those should be SmartDashboard values

Niklas Pruen
Niklas Pruen5:45 PM

Does anybody know the precision of the IMU? What is the smallest number of degrees that it can measure?

Niklas Pruen
Niklas Pruen5:54 PM

that's pretty good.. thanks!

2017-02-06
Brian Hilst
Brian Hilst9:18 AM

Has anyone heard about plans to work on the robot today?

Declan Freeman-Gleason
Declan Freeman-Gleason10:49 AM

I don't think there are any.

2017-02-08
Lia Johansen
Lia Johansen8:21 PM

Jack Stratton
Jack Stratton8:35 PM

So does the diamond plate extend to the boiler or is it a different wall material? :/

Lia Johansen
Lia Johansen8:36 PM

That's what we need to look at in the manual @Jack Stratton

Jack Stratton
Jack Stratton8:37 PM

I wish the GDC would say yes or no in response to that question rather than "yeah it's in the rules"

2017-02-09
Dana Batali
Dana Batali9:42 AM

this picture shows the diamond plate (back wall) continuing into the key. This raises an interesting possibility: we start in the key, shoot 10 balls, then deliver a sprocket to the side delivery spot.

Dana Batali
Dana Batali9:47 AM

and on the other autonomous question: from table 4.1:

AUTO mobility
For each ROBOT that breaks the BASE LINE vertical plane with their BUMPER by T=0

Dana Batali
Dana Batali9:51 AM

more on diamond plate

Jack Stratton
Jack Stratton3:08 PM


public final JoystickButton mturnIMUStart = new JoystickButton(mauxStick, 3);
public final JoystickButton mdriveDistance = new JoystickButton(mauxStick, 4);
public final JoystickButton mdriveDistancePID = new JoystickButton(mauxStick, 5);

public final JoystickButton mreplayRecord = new JoystickButton(mauxStick, 6);
public final JoystickButton mreplayStop = new JoystickButton(mauxStick, 7);
public final JoystickButton mreplayReplay = new JoystickButton(mauxStick, 9);

public final JoystickButton mintakeOn = new JoystickButton(mdriveStick, 7);
public final JoystickButton mintakeOff = new JoystickButton(mdriveStick, 9);
public final JoystickButton mintakeReverse = new JoystickButton(mdriveStick, 11);

public final JoystickButton mlauncherOn = new JoystickButton(mdriveStick, 8);
public final JoystickButton mlauncherOff = new JoystickButton(mdriveStick, 10);

public final JoystickButton mclimberOn = new JoystickButton(mdriveStick, 8);
public final JoystickButton mclimberOff = new JoystickButton(mdriveStick, 12);
public final JoystickButton mclimberSlow = new JoystickButton(mdriveStick, 10);

Jack Stratton
Jack Stratton3:09 PM

is our current button layout. if anything you've written but not yet pushed conflicts, change your buttons!

Jack Stratton
Jack Stratton3:09 PM

additionally, the number of buttons on the two sticks is different! the left (aux) stick only has ten, I think.

Jack Stratton
Jack Stratton3:11 PM

a whole lot of button shuffling will happen later (as we might not even have two joysticks) but right now since we're about to merge everything together we need to make sure only one thing happens with each button.

Jack Stratton
Jack Stratton3:11 PM

notably: climber on and launcher on are both drive stick #8 !

Jack Stratton
Jack Stratton3:11 PM

same with climber slow and launcher off

Chris Rininger
Chris Rininger3:55 PM

That last comment makes me think a modular setup like this could be worth exploring in the future (pardon the domain :grinning:). http://www.beersmith.com/mame/

Dana Batali
Dana Batali7:59 PM

jack: I pointed this out to the launcher team, and I understood that they have local (uncommitted) mods. Also, I made a similar comment on Noah's pull request (which is still pending)

Dana Batali
Dana Batali8:02 PM

It's probably a good idea for the launcher team to submit a pull request so we can achieve non-conflict, sooner rather than later.

Dana Batali
Dana Batali8:05 PM

chrisrin: I don't think it's necessary/reasonable to expect drivers to actuate more than 3-5 controls during a 3-minute competition. The issue we're currently having is that during software debugging we need more ways to test a variety of software options, and this is the source of our current conflict. That said, even with a need for only 3-5 controls, it might be cool to have them on a bling-y control panel rather than hiding on a gigantic (and confusing) joystick.

2017-02-10
Chris Rininger
Chris Rininger10:28 AM

Thanks - I was just reacting to the idea of using a single control for more than one function. UI-wise, the risk of errors & confusion, especially in the heat of competition, seems high. A modular kit that can be adapted to various needs to create a very clear UI across all functions seems like an interesting option.

Jack Stratton
Jack Stratton10:48 AM

where I think that modular control would be really great is spreading out the controls as a board of buttons (in some kind of order) instead of putting them wherever they fit on a control joystick (especially as we don't even have a movable piece requiring actual joystick movement this year, so it just seems inconvenient)

Jack Stratton
Jack Stratton10:48 AM

preseason project for electronics + programming next year?

Dana Batali
Dana Batali10:51 AM

@Jack Stratton: will you be at today's (weekday) meeting?

Jack Stratton
Jack Stratton10:51 AM

yes

Dana Batali
Dana Batali10:51 AM

excellent

Chris Rininger
Chris Rininger11:49 AM

jack: I could see that being a fun project for a couple people in the off- or pre-season. As far as projects go, I don't think it would be all that difficult to create a set of modules that could plug/play into a driver station using a kit like this... https://gameroomsolutions.com/shop/2-player-led-arcade-control-panel-bundle-kit/

Binnur Alkazily
Binnur Alkazily6:17 PM

I have been envying those teams that have custom driver stations! I am all for this!! Lets bring Spartronics colors and maybe even logo :wink:

Binnur Alkazily
Binnur Alkazily6:35 PM

@Declan Freeman-Gleason here is the CAD for game field in Solidworks
https://www.solidworks.com/sw/education/robot-student-design-contest.htm

Binnur Alkazily
Binnur Alkazily6:35 PM

here are some other versions (not sure which we use… :confused: )
https://www.chiefdelphi.com/forums/showthread.php?threadid=153108

Jack Stratton
Jack Stratton9:26 PM

I'm not a huge fan of how threads aren't expanded by default; I'm not sure that @Ronan Bennett @Brian Hutchison @Jeremy Lipschutz saw this. Hopefully you all see it now :)

Brian Hutchison
Brian Hutchison9:31 PM

I'm working on it right now

2017-02-11
Dana Batali
Dana Batali3:26 PM

Dana Batali
Dana Batali3:29 PM

Dana Batali
Dana Batali5:38 PM

2017-02-12
Dana Batali
Dana Batali12:18 PM

Dana Batali
Dana Batali12:25 PM

Dana Batali
Dana Batali12:29 PM

Lia Johansen
Lia Johansen2:20 PM

Binnur Alkazily
Binnur Alkazily2:39 PM

for autonomous (or any accurate shooting), robot needs to be aligned in front of the boiler. Note: boiler is 42” wide, and robot w/ bumpers is 40”.

Binnur Alkazily
Binnur Alkazily3:39 PM

@Declan Freeman-Gleason @Jack Stratton @Niklas Pruen (and @Dana Batali @Brian Hilst) here is what I am thinking about optimizing our command groups for autonomous strategies. Note: I am not attached to any naming, just looking to how we can optimize our command groups for better maintenance.

/*
Categories of command groups for autonomous
goal: minimize the maintenance required as we optimize our autonomous moves
*/

/*
DriveShootCross: from a given location, move to the boiler, shoot,
and optionally cross the baseline
Arguments:
- pass in the starting location, i.e. a set landmark for robot positions
- specify IF using playback motion to move (TRUE | FALSE)
- specify IF crossing the baseline (TRUE | FALSE)

Note: use a switch statement to specify move distance and turn values
- note: each of these could be calls to the other command groups or commands

General flow -- may need to add 'waits' in between each activity:
- move+turn and align w/ boiler
- shoot 10 balls
- move+turn to cross the baseline
- stop
*/

/*
DriveCrossbaseline: from either left or right starting position, move
forward a set distance to cross the baseline and stop

Note: existing drive forward w/ Position mode should be great for this!
*/

/*
DriveGearDrop: from middle location, drive backwards a set distance, and
cross fingers that it will magically align itself w/ the lift

Note: experiment w/
- Position mode for driving at low speed
- PID controlled PercentVbus for driving at low speed
*/

/*
BoilerDriveToGearDrop: after shooting, move to the lift to drop off the gear

Note: this is only IF we actually have time left :)
- arguments need to indicate which lift location to move to. We could
assume this is always the nearest to the boiler, but it would need to be
negotiated w/ alliance partners.
*/

Binnur Alkazily
Binnur Alkazily3:47 PM

Please optimize the commands to match our desired strategy priorities

Brian Hilst
Brian Hilst10:41 PM

Looking at the shooter position, can someone clarify where it is currently intended to shoot from? @Binnur Alkazily last stated it needed to be aligned in front of the boiler. Which side of the robot needs to be positioned there?

2017-02-13
Dana Batali
Dana Batali9:31 AM

@Brian Hilst - can you provide an update on when your and Niklas's code will be submitted? Did you ask Declan to review it?

Brian Hilst
Brian Hilst9:32 AM

I will submit a pull request this morning.

Dana Batali
Dana Batali10:19 AM

@Riyadth Al-Kazily : regarding bandwidth: we are allowed 7 Mbit/s, control packets consume about 0.1 Mbit/s (100 kbit/s), leaving 6.9 Mbit/s for the camera. (Of course our own smart-dashboard traffic isn't included, so really we have less.)
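The arithmetic, spelled out (the 100 kbit/s control figure is the rough number quoted above, and dashboard traffic would come out of the same budget):

```python
FMS_CAP_MBIT = 7.0   # per-team bandwidth cap during competition
CONTROL_MBIT = 0.1   # approximate control packet traffic (100 kbit/s)

# What remains for camera (and dashboard) traffic.
camera_budget = FMS_CAP_MBIT - CONTROL_MBIT
```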

Dana Batali
Dana Batali10:20 AM

Dana Batali
Dana Batali10:21 AM

Dana Batali
Dana Batali10:25 AM

@Riyadth Al-Kazily - here's a reddit thread that may pour some cold water on the notion of two usb cameras: https://www.reddit.com/r/FRC/comments/2syxn7/multipleusbcameras/

John Sachs
John Sachs9:29 PM

@Brian Hilst: the current plan is to shoot up and over the back (gear holder end) of the robot, so you would back the robot up so that it's centered and flush with the boiler. That said, the shooter can be adjusted to launch over the front (intake end) of the robot. A couple possible benefits of shooting over the front: the intake would be flush against the boiler rather than exposed to taking hits from other robots playing defense against us, the balls might have a slightly better trajectory, and you may be able to get away with only needing one camera (instead of front and back). The downside is you may lose some ball storage if you shoot over the front (have to keep a lane clear). Needless to say, lots of testing and decisions to be made over the next few days.

Dana Batali
Dana Batali9:33 PM

@John Sachs and @Brian Hilst there is an ongoing thread in launcher that is color coded to help better communicate

Riyadth Al-Kazily
Riyadth Al-Kazily9:56 PM

I'd hardly say cold water... More like a refreshing mist :-)
I found several solutions on-line that look good. I am particularly fond of this system:
https://github.com/RingOfFireOrg/FRC2017Competition/blob/2d3162537542a9c7b3258601be33c2d5122d84a3/src/org/usfirst/frc/team3459/robot/Cameras.java

Riyadth Al-Kazily
Riyadth Al-Kazily9:56 PM

It should also let us address the TCP port issue, which was a great observation in your earlier post (Thanks!)

Riyadth Al-Kazily
Riyadth Al-Kazily9:57 PM

I'll give the two camera solution a spin next time I have access to the robot. Maybe I'll be able to find a student to take the concept and run with it.

Chris Rininger
Chris Rininger11:23 PM

free(ish) stuff: I believe the coach recently acquired some 775pro motors for the team, and I noticed this CD thread that points out armabot.com has provided a free voucher in the kit of parts book for a $25 "Bad Boy" encoder that works that motor - just need to pay $5 to $7 shipping according to the thread - might be worth picking up since we have the motor - here's the thread: https://www.chiefdelphi.com/forums/showthread.php?threadid=153323

Enrique Chee
Enrique Chee11:34 PM

I ordered it already. Clio has it in the electronics section . It came in 2 weeks ago.

2017-02-14
Dana Batali
Dana Batali8:59 AM

the idea of switching between two camera feeds has a few pluses and minuses, but I think the pros probably outweigh the cons:

Pro Switched Camera Feeds:
- better use of bandwidth, which would allow us to increase the frame rate or image size
- less distracting than two images on the screen

Con Switched Camera Feeds:
- requires the user to press the switch; more programming time
- less information available to the drivers at any instant

Regarding manpower: since we do have a two-IP-camera solution working right now, we just need to make sure this change isn’t too distracting (or is appropriately distracting :-))

Regarding pros and cons of IP vs USB cameras:

Pro USB:
- fewer wires (no IP hub/switch, no camera power cords)
- simpler (no additional IP addresses, configuration wizards, etc)
- support for a wider, cheaper range of cameras (we have 3 already)

Pro IP:
- currently works, no roborio software required
- more graceful failure cases since it doesn’t depend upon the roborio
- more tested than usb cameras (both for us and globally)

Dana Batali
Dana Batali9:02 AM

So I guess before we press to the next step we want to get buy-in from the student leadership team?

Dana Batali
Dana Batali9:03 AM

@Enrique Chee @Clio Batali - any thoughts on this thread?

Enrique Chee
Enrique Chee9:43 AM

I will have Clio discuss with leadership on Wed at 3 pm. Switch camera feeds or not? USB vs IP? I will defer to you and Riyadth since I have not done any research. I suggest you tell Clio what you and Riyadth recommend and let student leadership make the final decision. Thanks

2017-02-15
Lia Johansen
Lia Johansen4:10 PM

Since we will not have the real robot until Friday, that puts us behind in testing, so programming will be meeting Saturday 2/18 from 1 pm - 6 pm, meeting early on Sunday 2/19 from 9 am - 1 pm, and then staying for the regular meeting from 1 pm - 5 pm

Binnur Alkazily
Binnur Alkazily4:10 PM

^^^

Timo Lahtinen
Timo Lahtinen5:06 PM

Dana Batali
Dana Batali5:06 PM

2017-02-16
Dana Batali
Dana Batali8:49 AM

@Jack Stratton, @Binnur Alkazily @Declan Freeman-Gleason

Dana Batali
Dana Batali8:57 AM

regarding auto selection:

- Jack's pull request has been merged... I made a few suggestions regarding display ordering, but that's a cosmetic issue, not a functional one.

- Jack makes the distinction between "Presets" and "Recordings". Presumably Declan will begin to populate the presets options

- I understood from declan that his commandgroups may be parameterized by initial field location. Since this information is provided by the driver, we'll need to convey this to interested CommandGroups/SubCommands via a "pull" from the SmartDashboard. I suggest we adopt the naming convention "AutoPosition" and "AutoPositionOptions" for this SmartDashboard field. I will make the changes to the SmartDashboard to support this. It will be up to Declan to populate AutoPositionOptions and to query "AutoPosition" in his commands. Declan should be able to follow the pattern for AutoStrategy and AutoStrategyOptions.

- I'm not sure how best to specify initial field position. There are more than 3 potential positions that touch the diamond plate and there are also at least two cases where the orientation of the robot matters. (nose vs butt to wall, oriented at an angle so we don't need to move in order to shoot)

- We do know which alliance we're on (through the HAL interface), so we won't need the driver to provide us this information

Jack Stratton
Jack Stratton9:54 AM

We can tack a `.sort()` on https://github.com/Spartronics4915/2017-Dashboard/blob/master/www/js/pgdriver.js#L43 to address your cosmetic issue

Jack Stratton
Jack Stratton9:56 AM

regarding parameterized commandgroups, I would suggest having three presets in the list for each command: Command - 1, Command - 2, Command - 3 or something to represent starting position. it would make for a rather large dropdown menu but I think the decrease in complexity would be worth it. those names would map to commandgroups preinitialized with the starting position as a constructor parameter
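To picture that suggestion, the dropdown could be backed by a flat table from preset label to a (command group, starting position) pair, so each menu entry is a preinitialized variant. An illustrative sketch in plain Python (the labels, group names, and dict shape are all made up for the sketch):

```python
# Hypothetical preset table: dropdown label -> (command group name, start position)
PRESETS = {
    "Drive Shoot Cross - 1": ("DriveShootCross", 1),
    "Drive Shoot Cross - 2": ("DriveShootCross", 2),
    "Drive Shoot Cross - 3": ("DriveShootCross", 3),
    "Cross Baseline - 1":    ("DriveCrossbaseline", 1),
    "Cross Baseline - 3":    ("DriveCrossbaseline", 3),
}


def build_auto(label):
    """Resolve the driver's dropdown selection into a parameterized command spec."""
    group, position = PRESETS[label]
    return {"group": group, "start_position": position}
```

Note the table also captures Dana's later point that position and strategy aren't orthogonal: a strategy simply omits entries for positions it can't run from.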

Jack Stratton
Jack Stratton9:57 AM

that might get slightly too ugly when we account for potential angles within the starting locations, not sure

Dana Batali
Dana Batali10:47 AM

hm - seems to me that expands exponentially with increasing numbers of starting positions. Perhaps a concrete example of the two scenarios is in order.

and regarding sorting, that sorta works, but never as well as explicit ordering in my experience.

Dana Batali
Dana Batali10:54 AM

Example with a single strategy menu:

Preset Fixed Shoot (Happy Place)
Preset Center Sprocket
Preset Non-Center Line Cross
Preset Fixed Shoot (Happy Place) + Line Cross
Preset Moving Shoot From Position 1
Preset Moving Shoot From Happy Place plus Sprocket Drop

Recorded Fixed Shoot (Happy Place)
Recorded Center Sprocket
Recorded Non-Center Line Cross
Recorded Fixed Shoot (Happy Place) + Line Cross
Recorded Moving Shoot From Position 1
Recorded Moving Shoot From Happy Place plus Sprocket Drop

Dana Batali
Dana Batali10:55 AM

(which, I guess works, but isn't alphabetized)

Dana Batali
Dana Batali10:56 AM

(but I haven't included a position 3 variant, nor have I presented orientation)

Dana Batali
Dana Batali10:57 AM

Conclusion: with this set, I'm now convinced that we don't need/want a separate position menu, since it would imply that position and strategy are orthogonal/independent.

Dana Batali
Dana Batali10:58 AM

in other words, I believe that certain strategies only work from certain positions

2017-02-17
Riyadth Al-Kazily
Riyadth Al-Kazily10:26 PM

CTRE has updated their Toolsuite software (for Talon SRX). Update your installations to avoid unhappiness!
http://www.ctr-electronics.com/control-system/hro.html#producttabstechnicalresources

2017-02-18
Binnur Alkazily
Binnur Alkazily9:44 AM

^^^

Binnur Alkazily
Binnur Alkazily12:42 PM

@Dana Batali which process do I need to kill?? looks like my ctrl-c didn’t kill the processes and closed the sockets cleanly
(master) binnur@Chiana(2017-Dashboard)> python DashboardServer.py
12:40:05 INFO : dashboard: Connecting to networktables at 10.49.15.2
12:40:05 INFO : nt : NetworkTables 2017.0.5 initialized in client mode
12:40:05 INFO : dashboard: Networktables Initialized
Traceback (most recent call last):
  File "DashboardServer.py", line 69, in <module>
    robotlog = Robotlog.Robotlog()
  File "/Users/binnur/Development/Spartronics/2017-Dashboard/pylib/Robotlog.py", line 79, in __init__
    self.udpsock.bind((addr, port))
  File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py", line 228, in meth
    return getattr(self._sock,name)(*args)
socket.error: [Errno 48] Address already in use
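
[editor's aside] The traceback above is the classic symptom of a previous DashboardServer (or a socket in TIME_WAIT) still holding the UDP port; `lsof -i :<port>` will show the process to kill. On the code side, setting SO_REUSEADDR before bind lets a restart rebind immediately. A minimal sketch; the default port here is made up, picked from the FRC team-use range:

```python
import socket

def make_udp_listener(addr="", port=5800):
    """Bind a UDP socket, tolerating a previous binding that wasn't
    closed cleanly (e.g. a ctrl-c'd server). Port 5800 is a placeholder
    from the FRC team-use range 5800-5810, not the actual Robotlog port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Allow immediate rebinding after an unclean shutdown.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind((addr, port))
    return sock
```

Killing the stale `python DashboardServer.py` process remains the real fix; the socket option just makes restarts painless.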

Tom Wiggin
Tom Wiggin5:24 PM

I will no longer attend meetings until next year, because I feel I haven't been very helpful, and entering a competition, the last thing we need is unhelpful people :slightly_smiling_face: I look forward to next year, and by then I will hopefully have actually learned some programming!

Tom Wiggin
Tom Wiggin6:16 PM

:hurtrealbad:

Lia Johansen
Lia Johansen6:49 PM

Hey everyone, so a reminder that tomorrow we will be meeting at 1 pm until late. Please bring food for dinner or money to buy dinner. See you then!

2017-02-19
Binnur Alkazily
Binnur Alkazily11:01 AM

@Brian Hilst @Declan Freeman-Gleason the DriveCommandGroup seems pretty straightforward to use. Two things to add:
1. Stop command
2. Print logs so we can see what we asked the robot to do

Note: in the code I don’t see anything that logs ‘here is the autonomous command the driver has chosen’ and ‘here are the steps we executed’. This is REALLY important, as it is very common to hear from drivers ‘hey, I told the robot to do xyz, and it didn’t.’ The only proof we have is what we log, and if we log nothing, then we have to take the word of the driver, and trust me, that is hard to debug
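
[editor's aside] The logging policy being asked for can be sketched like this (class and level names are invented for illustration): match-critical events go out at a level that survives any debug filtering.

```python
class RobotLogger:
    """Hypothetical sketch of an always-visible logging level."""
    DEBUG, INFO, NOTICE = 0, 1, 2

    def __init__(self, level=INFO):
        self.level = level
        self.lines = []          # stand-in for the driver-station console

    def log(self, level, msg):
        if level >= self.level:
            self.lines.append(msg)

    def notice(self, msg):
        # Match-critical messages (chosen auto, executed steps) use the
        # highest level, so they survive even when debug filters are
        # turned all the way down.
        self.log(self.NOTICE, msg)

logger = RobotLogger(level=RobotLogger.NOTICE)      # debug output filtered
logger.log(RobotLogger.DEBUG, "wheel velocity ...")  # dropped
logger.notice("auto selected: Cross Baseline from Position 2")
```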

Binnur Alkazily
Binnur Alkazily11:02 AM

^^^ we need to make sure this information is printed regardless of whether debug filters are turned down.

Binnur Alkazily
Binnur Alkazily11:08 AM

^^^ thinking… we could also omit the stop command and, at the end of the command group, just keep sending ‘stop’ to the drivetrain until autonomous ends.

Declan Freeman-Gleason
Declan Freeman-Gleason11:52 AM

Yeah, that makes sense. I'll be sure to add it.

Brian Hilst
Brian Hilst12:04 PM

@Binnur Alkazily Thanks!

Binnur Alkazily
Binnur Alkazily12:12 PM

@Brian Hilst @Declan Freeman-Gleason — for driving straight, what are we using — 1) position mode w/ IMU, 2) percentVbus w/ IMU, 3) undecided and need to make a decision?
(I am a confused mentor…)

Riyadth Al-Kazily
Riyadth Al-Kazily12:23 PM

@Jack Stratton I am concerned that if path recording is inadvertently started by the drivers during competition, we could use a lot of memory by the end of the 2.5 minute match. Will the drivers be aware, and able to turn recording off? Or is there some other safety feature we can put in place, such as only allowing recording to be enabled when not in an actual match (using information from the field management system)?

Binnur Alkazily
Binnur Alkazily12:23 PM

todo - in @Declan Freeman-Gleason branch, lets code review public void setControlMode(TalonControlMode m,
double fwdPeakV, double revPeakV,
double P, double I, double D, double F)

Binnur Alkazily
Binnur Alkazily12:24 PM

^^^ configPeakOutputVoltage is applicable to closed loop modes, right?

Jack Stratton
Jack Stratton12:24 PM

riyadth: good point. the recording controls are already located on the alt drive stick (which we shouldn't be touching at all) but I can add something like a 15 second limit or unchecked-by-default checkbox on the dashboard
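
[editor's aside] Both safety ideas above (off-by-default dashboard checkbox, hard time limit) can be sketched together; the names and the 50 Hz sample rate are assumptions for illustration, not the team's actual recorder.

```python
class PathRecorder:
    """Hypothetical sketch: recording is off by default, and samples are
    capped so an accidental start can't grow unbounded during a match."""
    MAX_SECONDS = 15.0
    SAMPLES_PER_SECOND = 50        # assumes one sample per 20 ms robot loop

    def __init__(self):
        self.enabled = False       # dashboard checkbox, unchecked by default
        self.samples = []

    def record(self, left, right, now):
        if not self.enabled:
            return
        if len(self.samples) >= self.MAX_SECONDS * self.SAMPLES_PER_SECOND:
            self.enabled = False   # hard stop at the 15-second cap
            return
        self.samples.append((now, left, right))
```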

Binnur Alkazily
Binnur Alkazily12:26 PM

explicit control on the dashboard (check/uncheck recording mode) is more foolproof

Binnur Alkazily
Binnur Alkazily12:26 PM

and should be off by default

Binnur Alkazily
Binnur Alkazily12:27 PM

at least for me :slightlysmilingface:

Riyadth Al-Kazily
Riyadth Al-Kazily12:31 PM

Question for the team. I see that in SpartronicsSubsystem m_initialized defaults to 'true'. This seems odd, as it is set to true before the subsystem is actually initialized. Should it be 'false' by default, with each subsystem then setting it to true when initialization is finished?
Otherwise, subsystems will appear to be initialized while they are still initializing, which could cause improper execution of methods.
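
[editor's aside] The fix being proposed, sketched in Python with hypothetical subsystem names: the flag starts false, and each subsystem flips it only as the last step of its own initialization.

```python
class Subsystem:
    """Base class: not initialized until proven otherwise."""
    def __init__(self):
        self.initialized = False

class Launcher(Subsystem):
    def __init__(self):
        Subsystem.__init__(self)
        self.motor_configured = True   # stand-in for real motor setup
        self.initialized = True        # last statement of the constructor

    def set_speed(self, rpm):
        if not self.initialized:
            return False               # guard against calls during init
        return True
```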

Binnur Alkazily
Binnur Alkazily12:51 PM

We should also talk about timeouts we can add to commands - they can be useful for turning, and would stop us from dancing forever on the field
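
[editor's aside] The timeout idea, as a sketch (names and the 2-degree tolerance are made up): the command finishes when the turn completes OR the timeout elapses, so a bad PID tune can't leave the robot dancing for the rest of autonomous.

```python
class TurnCommand:
    """Hypothetical command: isFinished-style check with a timeout."""
    def __init__(self, target_degrees, timeout=2.0):
        self.target = target_degrees
        self.timeout = timeout
        self.start_time = None

    def initialize(self, now):
        self.start_time = now

    def is_finished(self, heading, now):
        on_target = abs(heading - self.target) < 2.0    # degrees of tolerance
        timed_out = (now - self.start_time) >= self.timeout
        return on_target or timed_out
```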

Binnur Alkazily
Binnur Alkazily4:54 PM

@Riyadth Al-Kazily
04:13:12 NOTICE OI: =================================================
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0xac611458, pid=6192, tid=2885981280
#
# JRE version: Java(TM) SE Embedded Runtime Environment (8.0_06-b23) (build 1.8.0_06-b23)
# Java VM: Java HotSpot(TM) Embedded Client VM (25.6-b23 mixed mode linux-arm )
# Problematic frame:
# C [libcscore.so+0x21458] cs::MjpegServerImpl::ConnThread::SendStream(wpi::raw_socket_ostream&)+0x3b0
#
# Core dump written. Default location: //core or core.6192 (max size 2048 kB). To ensure a full core dump, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# //hs_err_pid6192.log
#
# If you would like to submit a bug report, please visit:
# http://bugreport.sun.com/bugreport/crash.jsp
#
➔ Launching «'/usr/local/frc/JRE/bin/java' '-Djava.library.path=/usr/local/frc/lib/' '-jar' '/home/lvuser/FRCUserProgram.jar'»

Lia Johansen
Lia Johansen5:28 PM

Change of schedule: We are meeting tomorrow (Monday) from 1-7 to allow mechanics to work on the robot.

Dana Batali
Dana Batali5:59 PM

@Riyadth Al-Kazily gets the prize! First one to crash the java runtime!

Terry Shields
Terry Shields7:55 PM

@Terry Shields has joined the channel

Binnur Alkazily
Binnur Alkazily8:48 PM

Auto driving and shooting w/ a stuck launcher

Binnur Alkazily
Binnur Alkazily8:50 PM

Dancing robot - shoots and works towards the cross line but wants to dance instead

Binnur Alkazily
Binnur Alkazily8:52 PM

Auto playback moves w/ shooting

Binnur Alkazily
Binnur Alkazily9:58 PM

here are some notes/observations from today’s activities. Anything else I am missing? Any corrections or questions?
- Need to tune PID for IMU turning —> applies when backing up from the boiler to cross the baseline
- Driving backwards is not incorporating the IMU heading correction properly (or is applying the correction to the wrong motor?)
- Why are we seeing ‘Output not updated often enough’? stop() writes both set(0) on the motors and arcadeDrive(0,0)
- Autonomous launcher: should we add a timeout to the launch command, so that if the launcher doesn’t finish for whatever reason, the autonomous strategy continues to run?
- Why do we need to keep resetting the robot code?
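
[editor's aside] On the drive-straight items above: a common shape for IMU heading correction is a P term on the heading error, with the error wrapped into [-180, 180). Note that with plain arcade mixing (left = throttle + turn, right = throttle - turn) the correction's sign does not depend on throttle sign, so a backwards-only failure points at the correction being applied to the wrong side, as the note above suspects. A sketch, not the team's actual code; kp is a made-up starting value:

```python
def drive_straight_correction(target_heading, imu_heading, kp=0.03):
    """P-only heading hold: returns the turn value to feed arcadeDrive.
    Wrapping the error into [-180, 180) avoids a huge correction when
    the IMU heading crosses the +/-180 degree boundary."""
    error = (target_heading - imu_heading + 180.0) % 360.0 - 180.0
    return kp * error
```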

Rose Bandrowski
Rose Bandrowski9:58 PM

@Rose Bandrowski has left the channel

Clio Batali
Clio Batali10:46 PM

Master code is still faulty! We're having lots of issues with commands being initialized to OFF - this is the same issue the launcher group experienced today, but it now affects the launcher, climber, and intake. That said, most things are working, despite launcher jams

Clio Batali
Clio Batali11:02 PM

Also: please set intake to 0.9 NOT 0.75 (where it is now)

Enrique Chee
Enrique Chee11:34 PM

Lia , can you please have someone in programming type out what every button on the controller does . A legend sheet . Thanks

Jack Stratton
Jack Stratton11:39 PM

I thought Clio had one? Here's the code (until a proper list can be made) if it helps. https://github.com/Spartronics4915/2017-STEAMworks/blob/master/src/org/usfirst/frc/team4915/steamworks/OI.java#L54-L90

Enrique Chee
Enrique Chee11:42 PM

lets attach to driver station, thanks

Lia Johansen
Lia Johansen11:46 PM

@Enrique Chee yes. I have them written in my notebook. Will make a cleaner copy tomorrow

Enrique Chee
Enrique Chee11:48 PM

Thanks

2017-02-20
Sean Hooyer
Sean Hooyer11:14 AM

@Sean Hooyer has joined the channel

Sean Hooyer
Sean Hooyer11:17 AM

The top part of the boiler is marked correctly but constructed incorrectly (2 inches off). We can fix it, but we are waiting for permission from the programming team.

Binnur Alkazily
Binnur Alkazily11:25 AM

@Binnur Alkazily pinned their image Screen Shot 2015-02-04 at 4.45.04 PM.png to this channel.

Binnur Alkazily
Binnur Alkazily11:36 AM

@Sean Hooyer we need an accurate boiler for testing and tuning. programmers will be in at 1pm today. Please fix ASAP. Thank you!

Binnur Alkazily
Binnur Alkazily11:36 AM

btw - what part is off? height or diameter or ??

Clio Batali
Clio Batali11:38 AM

The top portion is just aligned incorrectly with the base - we will discuss when programmers arrive (should be a quick fix)

Binnur Alkazily
Binnur Alkazily11:38 AM

works! we can work on the turning accuracy to start with

Binnur Alkazily
Binnur Alkazily11:40 AM

@Declan Freeman-Gleason please add a test button so we can test turn right as well as turn left today. Pls program values so that we can simulate and adjust PID values
- turns to boiler from both red + blue alliance
- turns to cross the baseline from both red + blue alliance

Binnur Alkazily
Binnur Alkazily11:41 AM

^^^ trying to avoid extra steps required to reposition and run autonomous code

Lia Johansen
Lia Johansen11:46 AM

@Binnur Alkazily we also need to make sure all of the code on master is working. Right now, it is faulty and unreliable

Binnur Alkazily
Binnur Alkazily11:54 AM

K - please make a list of known issues in prioritized order and assign to teams

Lia Johansen
Lia Johansen11:55 AM

I don't know the faults yet. Clio will inform me when i arrive

Binnur Alkazily
Binnur Alkazily11:55 AM

a checklist of test cases prior to bagging is a good thing to do :slightlysmilingface:

Lia Johansen
Lia Johansen11:55 AM

Launcher fix is first

Lia Johansen
Lia Johansen11:55 AM

When they do that I'll figure out the rest of the priority list

Lia Johansen
Lia Johansen11:55 AM

:+1::skin-tone-3::+1::skin-tone-3::+1::skin-tone-3:

Sean Hooyer
Sean Hooyer12:02 PM

@Binnur Alkazily we fixed the boiler problem

Lia Johansen
Lia Johansen12:09 PM

Thanks @Sean Hooyer

Binnur Alkazily
Binnur Alkazily12:12 PM

Thank you @Sean Hooyer

Declan Freeman-Gleason
Declan Freeman-Gleason1:13 PM

Here's a good article on test mode and PID tuning: http://wpilib.screenstepslive.com/s/4485/m/26401/l/255413-pid-tuning-with-smartdashboard

Jack Stratton
Jack Stratton1:22 PM

@Brian Hutchison @Jeremy Lipschutz try replacing this with `m_launcher.setLauncher(LauncherState.OFF)` https://github.com/Spartronics4915/2017-STEAMworks/blob/master/src/org/usfirst/frc/team4915/steamworks/commands/LauncherCommand.java#L73

Binnur Alkazily
Binnur Alkazily2:20 PM

@Jack Stratton to resolve the version path issue w/ git - is that documented anywhere?

Timo Lahtinen
Timo Lahtinen5:21 PM

For future problem solving, my git path is C:\Program Files\Git\mingw64\bin

2017-02-21
Brian Hilst
Brian Hilst9:28 AM

@Niklas Pruen, @Binnur Alkazily and I completed and tested a new DriveCurveCommand to get from the boiler to the baseline faster during autonomous. It appears to work well and is low risk. The ParameterizedCommandGroup was also updated to support a new “Curve” command, and the group now supports a variable number of parameters per command (e.g. Shoot takes no params and Curve takes 3).

Here is what remains for today:
1. @Timo Lahtinen @Lia Johansen review and merge the pull request.
2. @Declan Freeman-Gleason @Niklas Pruen Integrate and tune the new “Curve” command into the "Drive Shoot and Cross Baseline Position 3” group. A new "Cross baseline from boiler” was added to help tune the distance.
3. The group needs to determine how best to increase the speed, as the default is too slow to make it across the baseline in 15 seconds reliably. We found that setting peak voltage in Drivetrain.setControlMode() has no effect. Drivetrain setMaxOutput needs to be increased to at least 0.4. Our current thinking is to only do this for this command, to avoid impacting other driving commands. We will need to test with the other auto commands to see if it helps get our time down.

Brian Hilst
Brian Hilst9:35 AM

@Timo Lahtinen @Lia Johansen Reminder that we would like to reconfigure the test field down by the 300 building main doors to provide a larger space for autonomous testing. The current location does not allow driving across the baseline. We should also remeasure and verify the key field positions for the boiler, key, baseline & gear lift(s), to make sure our autonomous commands work correctly on the real field and to allow more accurate driving practice.

Lia Johansen
Lia Johansen9:46 AM

@Brian Hilst just merged the code. We can set up the field for you down the hall. Launcher group will be getting the robot first to do some testing and then autonomous will get it for the rest of time. Robot will be in Programming from 1-4 pm

Brian Hilst
Brian Hilst9:47 AM

@Lia Johansen Thanks! @Niklas Pruen and I will go over early to work on moving so we’re ready to test at 1pm

Lia Johansen
Lia Johansen9:47 AM

@Brian Hilst i dont know if doors will be open before 1 pm

Lia Johansen
Lia Johansen9:48 AM

and auto people will get the robot probably around 1:30? 1:45 ish

Brian Hilst
Brian Hilst9:48 AM

@Lia Johansen Ok. It shouldn’t take long to do at 1pm then.

Lia Johansen
Lia Johansen9:49 AM

@Brian Hilst agreed. That's awesome you and niklas got a curve!

Chris Rininger
Chris Rininger9:50 AM

It was awesome to see things come together at the end of last night, and I think the work of the programmers was a big contribution. One thing I noticed on the approach to the climber and other game tasks that would be a major help imo is the ability to turn slowly, with finesse and accuracy. It seemed like when a very small turn was needed, what happened instead was a large, nearly 90 degree turn. On the Xbox controller there is an extra joystick not yet used, as I understand... could it be used for slower, more precise turning? Maybe a way to trigger a tiny pulse of current to nudge the robot with a single press left or right?

Lia Johansen
Lia Johansen9:54 AM

@Chris Rininger Yes, potentially

Lia Johansen
Lia Johansen9:58 AM

@Chris Rininger alex said the turning seemed fine; he just needs to practice. Also, when the robot was approaching the boiler it was not on the carpet; that contributes to the way the robot turns

Binnur Alkazily
Binnur Alkazily10:00 AM

@Lia Johansen @Riyadth Al-Kazily and I would like to work w/ the launcher team to see if we can tune the launcher values further — we are building some theories to test w/. fyi

Lia Johansen
Lia Johansen10:00 AM

@Binnur Alkazily sounds great. Thanks! Could you test that in 1.5 hrs or less?

Binnur Alkazily
Binnur Alkazily10:01 AM

yup - I really hope so… I don’t think anyone could last tuning for that long and not lose their mind :wink:

Chris Rininger
Chris Rininger10:01 AM

@Lia Johansen ah ok - the robot just seemed a bit lurchy on the in-place turns when trying to line things up, and I assumed it was just a characteristic of tank drive so a nudge left / nudge right function could save cycle time - of course practice will help a lot

Lia Johansen
Lia Johansen10:04 AM

@Chris Rininger Considering the time constraint, practice driving will be the best option. Jack got pretty good at turning and driving with auto testing; alex will get there too

Chris Rininger
Chris Rininger10:06 AM

could be good to have for back up drivers that won't get much practice time... time permitting

Lia Johansen
Lia Johansen10:08 AM

@Binnur Alkazily ^^^ what do you say about the slower turning?

Alex Larson Freeman
Alex Larson Freeman10:09 AM

I noticed the turning was different on carpet vs the hallway floor which is what we used for climbing. Not sure if this has been said

Alex Larson Freeman
Alex Larson Freeman10:09 AM

Turning was great on carpet I had no issues there, I just need more practice

Declan Freeman-Gleason
Declan Freeman-Gleason10:09 AM

@Lia Johansen @Binnur Alkazily @Chris Rininger Yeah, I wanted to mention that this might be a non-issue because the robot's movement on the carpet vs. off is very different.

Binnur Alkazily
Binnur Alkazily10:10 AM

I didn’t see issues w/ turning - however, one needs to get adjusted w/ driving using the throttle button down while turning.
the xbox controller, for a mentor, requires more training

Lia Johansen
Lia Johansen10:10 AM

@Binnur Alkazily where is the throttle button?

Lia Johansen
Lia Johansen10:10 AM

Ik where it is on the regular joystick

Binnur Alkazily
Binnur Alkazily10:10 AM

as @Brian Hilst indicated, we should explore cranking up the max output speed of the robot and see how teleop behaves

Binnur Alkazily
Binnur Alkazily10:11 AM

(on the xbox, to turn, I had to keep pushing the move forward button w/ turns. and that controlled the turning action to better accuracy)

Chris Rininger
Chris Rininger10:12 AM

thx for considering the idea - I would like to move the climbing rope to the carpet today & see how much easier it is for driver to line up - I want those 50 points every match :slightlysmilingface:

Lia Johansen
Lia Johansen10:12 AM

@Chris Rininger agreed! We can do that.

Lia Johansen
Lia Johansen10:13 AM

@Binnur Alkazily outline for our time with robot (1-4 pm). Launching then autonomous then climber

Binnur Alkazily
Binnur Alkazily10:15 AM

and lots of driving :slightlysmilingface:

Enrique Chee
Enrique Chee11:11 AM

Repost from Paul.

Binnur Alkazily
Binnur Alkazily11:37 AM

@Lia Johansen ^^^ pls check w/ programmers on how we can best do this

Binnur Alkazily
Binnur Alkazily11:37 AM

Launcher team, here is what I am thinking for a plan. Please review and adjust as needed. See you soon.
1. Let’s make sure the Talon is set up in code:

m_launcherMotor.setAllowableClosedLoopErr(0); // this may cause oscillation; we need to observe behavior
m_launcherMotor.setCloseLoopRampRate(0.0); // let’s get to our max output quickly
m_launcherMotor.setVoltageRampRate(0.0); // let’s get to our max output quickly
m_launcherMotor.enableBrakeMode(false); // defensive programming in case unexpected things happen in configurations

2. Verify FF is set correctly using the RIO webpage (our calculations show 0.04995 for 3000 RPM)
- zero PID
- set slider to 3000 RPM (assuming this is the set RPM we are targeting)
- adjust FF till webpage RPM is 3000
- using this FF, set slider to 1500 RPM —> verify webpage RPM is ~1500
3. Adjust PID values and test w/ balls to ensure the system behaves
- increase k_p till we see oscillation
- adjust k_i and k_d
4. Question: how does isLauncherAtSpeed() work?
5. Once tuned, please set these values in code to make sure they are saved regardless of firmware updates

And, if this process works, please copy and include it as comments in your launcher code for next year.

Note to self: we may need to set i_zone given k_i.
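
[editor's aside] The 0.04995 figure in step 2 is consistent with the usual Talon SRX velocity feed-forward formula, F = 1023 / (target speed in native units), where native velocity units are encoder ticks per 100 ms. A worked check, assuming the 4096 ticks-per-revolution mag encoder:

```python
def talon_feedforward(target_rpm, ticks_per_rev=4096):
    """F-gain = full-throttle units (1023) / target speed in native units.
    Native Talon SRX velocity units are encoder ticks per 100 ms, so
    RPM converts via ticks_per_rev / 600."""
    native_per_100ms = target_rpm * ticks_per_rev / 600.0
    return 1023.0 / native_per_100ms

f = talon_feedforward(3000)   # ~0.04995, matching the plan above
```

Halving the target RPM doubles F, which is why verifying at 1500 RPM (step 2, last bullet) is a good sanity check.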

Lia Johansen
Lia Johansen11:38 AM

@Noah Martin please read paul's recommendation

Noah Martin
Noah Martin12:05 PM

@Lia Johansen got it

Binnur Alkazily
Binnur Alkazily12:11 PM

@Noah Martin it may be worth doing two buttons, or the same button but w/ hold behavior -- @Jack Stratton @Lia Johansen make sure to weigh in on the desired behavior

Dana Batali
Dana Batali12:43 PM

since we have limited time remaining with the robot, I feel the need to raise a bit of a flag. If things have been working “well enough”, we need to be careful not to introduce more risk/variance in our attempts to improve things. Code changes should be vetted extra carefully at this stage.

Dana Batali
Dana Batali12:50 PM

@Binnur Alkazily - iirc the climber has two buttons (one for slow, one for fast). It is currently possible for the driver to initiate climbing in slow mode, then after a grab to switch to high-speed. There was discussion last night on how/whether we can automate this task. There was also discussion on how to auto-stop. At this late stage, we need to carefully evaluate whether the value of additional climbing automation outweighs the risks. This overlaps with the relative priorities for robot access: juggling between more driver practice, more PID tuning for drivetrain and for launcher, and more automation for climber. If it were up to a vote, I think I’d vote that we devote a little more time to auto strategies and then let the drivers practice, practice, practice.

(i’ve always got 2 cents to spare :-))

Binnur Alkazily
Binnur Alkazily12:55 PM

I will support climber assessment on not needing automation as long as it can be manually managed by drivers safely. I think @Riyadth Al-Kazily will have input on this. Good to hear we already have buttons programmed

Binnur Alkazily
Binnur Alkazily8:38 PM

@Brian Hutchison please post the failed blue run to this slack channel. Thanks.

Binnur Alkazily
Binnur Alkazily8:40 PM

^^oppps! Wrong brian!

Binnur Alkazily
Binnur Alkazily8:41 PM

@Brian Hilst please post the failed blue run auto to this slack channel. Thanks!

Brian Hilst
Brian Hilst8:44 PM

Chris Rininger
Chris Rininger10:58 PM

My 2 cents on the climber: manual motor stop seems viable based on testing so far. We just have to make sure the touchpad is fully pressed, because when the motor is cut the rope will slip back to the most recent ratchet position (maybe 1/8 to 1/4 inch?). I would not prioritize auto-stop on the climber over giving drivers more practice time.

2017-02-22
Enrique Chee
Enrique Chee12:53 AM

Did we test the robot with the tether? Don't forget to bring our backup radio!

Binnur Alkazily
Binnur Alkazily1:32 PM

@Lia Johansen ^^^

Dana Batali
Dana Batali2:34 PM

@Binnur Alkazily, @Brian Hilst @Declan Freeman-Gleason, @Jack Stratton, @Lia Johansen :

I looked into a better refactoring of the auto mode selection process and here's what I came up with.
The commit message spells out how we got to this place. I won't submit a pull request for this unless the collective feels it's warranted. It's possible there are alternate approaches, and I'd encourage declan to peruse this code and develop an opinion on the why, and on whether we should make such a change or invest in an alternate approach

https://github.com/dbadb/2017-STEAMworks/commit/b1dfe7982be06f2070ed28e139e7ca3d9cbcc8dc

Lia Johansen
Lia Johansen3:15 PM

@Enrique Chee we did not test with tether. Back up radio is on my list

Binnur Alkazily
Binnur Alkazily7:23 PM

@Dana Batali the code and approach looks good. thank you for making time for it.
@Declan Freeman-Gleason @Jack Stratton @Lia Johansen please review and discuss how you want to approach this. It can easily be tested using our backup robot to validate.

2017-02-25
Jack Stratton
Jack Stratton11:33 PM

dana_batali: I like your changes (especially how much clearer the ParameterizedCommandGroup constructors are when broken up like that), however I feel like there's a slight lack of flexibility in adding new strategies with all the string constants (we're done doing that anyway, so whatever) and, minor-ly, `getAllianceScale`'s method body could be a call to `returnForSide(alliance, -1, 1)`.
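
[editor's aside] The `getAllianceScale`/`returnForSide` relationship, sketched in Python for illustration (the real helpers are Java methods in the robot code; these signatures are assumed):

```python
def return_for_side(alliance, red_value, blue_value):
    """Pick a value based on alliance color."""
    return red_value if alliance == "red" else blue_value

def get_alliance_scale(alliance):
    # Jack's point: the +/-1 mirror factor used to flip autonomous paths
    # between alliances is just a special case of return_for_side.
    return return_for_side(alliance, -1, 1)
```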

Jack Stratton
Jack Stratton11:33 PM

I'm going to see what I can put together regarding an object-based parameterized commandgroup; we probably won't be using it - I'm just proving a point

Jack Stratton
Jack Stratton11:33 PM

maybe if (big if) we have extra time in this week's meetings we can do some testing with it on the second chassis

2017-02-26
Dana Batali
Dana Batali10:59 AM

I fully concur that an array of actions is better than an array of pairs of strings..
We do have to tread carefully given the nearness of our first competition

Dana Batali
Dana Batali11:00 AM

But the more pressing problem pertains to how/where to filter the recording list to present a UI before the alliance is known

Jack Stratton
Jack Stratton11:05 AM

honestly, with how well declan's is doing we might consider just deleting each recording. further, the ugly timestamp names can be fixed with just a 'mv <filename> "Strategy Description"'

Jack Stratton
Jack Stratton11:06 AM

(once inside the robot)

Dana Batali
Dana Batali11:54 AM

Certainly, I think we've learned that the recordings grow "stale" over time and so the ones currently present on the robot are likely to be in the stale state. I do think that it's a very handy backup to have this recording capability. And since it's really easy to assert that the file name is the GUI, you're right, all we need to do is choose those names artfully. Still we'll end up with two versions of the alliance-specific recordings. I think we have three choices to help prevent driver error: 1) do nothing, it's not a big deal 2) offer a list of options, then during command construction (at autoInit-time), we could further "specialize" the file name. 3) perform the filtering on the driver station with a snippet of javascript (and a recording naming convention). If, as you suggest, the recording approach may not end up being used, then option #1 is a fine choice.
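
[editor's aside] Option 3 above (filter by a naming convention) is small enough to sketch; the file names below are taken from the surrounding discussion, and the function is illustrative, not the team's dashboard code:

```python
def recordings_for_alliance(filenames, alliance):
    """Show only the current alliance's replays, plus any recording
    whose name doesn't follow the Red/Blue prefix convention."""
    prefix = alliance.capitalize() + " "
    return [f for f in filenames
            if f.startswith(prefix) or not f.startswith(("Red ", "Blue "))]

files = ["Red 1 Drive - Shoot - Cross Baseline",
         "Blue 3 Drive - Shoot - Cross Baseline",
         "Center Sprocket"]
```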

Jack Stratton
Jack Stratton1:51 PM

I feel good about just leaving two (if any) replays for the driver to pick: `Red 1 Drive - Shoot - Cross Baseline` and `Blue 3 Drive - Shoot - Cross Baseline`

2017-02-27
Brian Hilst
Brian Hilst11:04 AM

Is there a plan for the programming team for the Tues & Weds afternoon sessions this week?

Enrique Chee
Enrique Chee11:38 AM

yes, captains will respond. I hope.

Riyadth Al-Kazily
Riyadth Al-Kazily1:12 PM

CTRE has updated their libraries again. Now up to version 4.4.1.12. The biggest change in the release notes is the use of current measurement, which I think we might be doing in some of our subsystems. We may want to update our workstations and deploy freshly built code to the robot...
http://www.ctr-electronics.com/hro.html#producttabstechnicalresources

Riyadth Al-Kazily
Riyadth Al-Kazily1:13 PM

The release notes:

CTRE Toolsuite 4.4.1.12 Installer
Talon SRX Firmware (2.34): Minor modification to start up frame. This will allow for future features (such as ESD detection). This will not impact any current use case of the Talon SRX.
Talon SRX Firmware (2.33): Fixed issue where motion magic halts abruptly due to velocity-to-acceleration ratio exceeding threshold.
Talon SRX Firmware (2.33): Changed Talon SRX current measurement to round instead of truncate.
Talon SRX Firmware (2.31): Signal added to Status 8 for CAN driver status.
Talon SRX Firmware (2.31): Robustness improvement in CAN buffering. This was not necessary to resolve any issues, this was merely an improvement.
Talon SRX Firmware (2.31): Various optimizations in the current-draw measurement. This was not necessary to resolve any issues, this was merely an improvement.
Talon SRX Firmware (2.31): Solved a possible divide-by-zero condition in the current-draw measurement. This did not solve any known or reproducible issues.
Talon SRX Firmware (2.30/10.30): Timing improvements added to correct the regression issues of the previous installer's Talon firmware (X.23).
Talon SRX Firmware (2.30/10.30): The velocity measurement window will automatically truncate to the nearest supported value (1, 2, 4, 8, 16, 32, 64). For example, if the robot controller attempts to set a window value of '50', the signal value will be truncated to '32'.
Function Limitation: As a result of the performance improvements in the current-draw measurement, the current measurement for a given load may be dissimilar to the measurement when using previous firmware. The difference should not exceed 0.125A, and only occurs near current-draws that are close to a multiple of 0.125A boundary.
Class Library (FRC Java 2017_v5): Updated comment headers.
Class Library (FRC C++ 2017_v5): Fixed bug where JNI library was not saving the last error code.
Class Library (FRC C++ 2017_v5): Updated comment headers.
Class Library (FRC LabVIEW 2017_v6): Updated Talon Context Help & VI Palette short names.
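
[editor's aside] The velocity-window rule quoted in the release notes (truncate down to the nearest supported size, so a requested 50 becomes 32) is simple enough to mirror in a few lines, for anyone reasoning about what the firmware will actually use:

```python
def truncate_measurement_window(requested, supported=(1, 2, 4, 8, 16, 32, 64)):
    """Mirror of the firmware behavior described above: the requested
    velocity measurement window is truncated down to the nearest
    supported value."""
    return max(v for v in supported if v <= requested)
```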

Lia Johansen
Lia Johansen1:40 PM

@Brian Hilst we will be testing auto (on the second robot or the first), testing the robot with the tether, labeling buttons, creating to-bring lists, and planning for competition.

Clio Batali
Clio Batali2:41 PM

@Brian Hilst Programmers will be meeting after the scouting part of the meeting concludes (around 4:30) in order to go through Lia's agenda. The 2nd chassis will be available for you guys to test new code on until the primary robot is ready to be driven (theoretically about 6), at which point drivers and remaining programmers will run through a few more tests/practice. There will be time on Wednesday for packing and driving as well

2017-02-28
Declan Freeman-Gleason
Declan Freeman-Gleason6:33 PM

I'll make a pull request to resolve this soon.

Lia Johansen
Lia Johansen6:58 PM

I merged his pull request.

Lia Johansen
Lia Johansen7:51 PM

@Dana Batali : we used your code (seen above in declan's message) and today there were issues where the camera wouldn't switch with the button and sometimes not with the drop down menu. What are your thoughts? And are you going to be at the meeting tomorrow?

Lia Johansen
Lia Johansen8:12 PM

for tomorrow, not every programmer needs to come: Timo, declan, and I will be there. We will be working on fixing the camera on the smartdashboard.

Brian Hilst
Brian Hilst8:49 PM

@Lia Johansen: We might also consider falling back to the prior version if we can't readily find the problem. That might be a good place to start to make sure we have a viable option since the cameras are so critical this year.

Lia Johansen
Lia Johansen8:51 PM

@Brian Hilst i agree

Dana Batali
Dana Batali8:55 PM

the dashboard issues are very simple, i guess I'm surprised that the work i offered for the robot code, to fix autonomous selection (and also reduce the likelihood of "RobotDrive not updated enough"), don't appear to have been pulled?

Dana Batali
Dana Batali8:56 PM

that is, my dashboard changes weren't that big of a deal (notwithstanding declan's discovery of a typo).

Dana Batali
Dana Batali8:56 PM

my big concern is for autonomous selection... this is what jack and I were discussing earlier on this thread. Lia, Timo - did you deem these unuseful?

Dana Batali
Dana Batali8:57 PM

(btw: sorry for the smartdashboard typos, i didn't think anyone would pull these beside me)

Riyadth Al-Kazily
Riyadth Al-Kazily8:58 PM

lia_johansen: Do both cameras work when loaded in separate web pages? I recall that Dana made shortcuts in Chrome (bookmark bar) on the drivers station.

Dana Batali
Dana Batali8:58 PM

I will attend tomorrow's meeting if you feel that I'll be helpful.

Lia Johansen
Lia Johansen9:00 PM

We did not try that

Lia Johansen
Lia Johansen9:53 PM

@Dana Batali i believe declan just tested your code from your branch. The selection seems to be working fine. I think it would be helpful that you attend for a bit tomorrow if possible

2017-03-01
Dana Batali
Dana Batali7:57 AM

@Lia Johansen - i'll try to be there around 3:30pm

Lia Johansen
Lia Johansen8:02 AM

@Dana Batali thanks!

2017-03-02
Lia Johansen
Lia Johansen12:56 PM

hey everyone. We seem to have lost a school laptop charger. Have any of you accidentally brought it home or placed it somewhere?

Lia Johansen
Lia Johansen3:32 PM

: for competition , only @Declan Freeman-Gleason @Brian Hutchison @Timo Lahtinen @Jack Stratton need to bring laptops. Others do not

Binnur Alkazily
Binnur Alkazily4:27 PM

note - @Declan Freeman-Gleason needs power for his laptop -- please make sure we have an extension cord, as it may be hard to find plugs

2017-03-04
Declan Freeman-Gleason
Declan Freeman-Gleason9:40 AM

The length of the key is 155 inches

Declan Freeman-Gleason
Declan Freeman-Gleason9:41 AM

The length from the baseline to the hopper button base is 10 inches

Declan Freeman-Gleason
Declan Freeman-Gleason9:42 AM

The length from the diamond plate to the gear spring, not including the ~3 in part that sticks out, is 110 in

2017-03-05
Declan Freeman-Gleason
Declan Freeman-Gleason6:26 PM

- Get agitator working again
- Make gear placement easier for drivers
- Make an autonomous program that aligns with the gear
- Make autonomous atoms faster

Lia Johansen
Lia Johansen6:37 PM

Agitator is probably mechanics but we will test

Declan Freeman-Gleason
Declan Freeman-Gleason6:38 PM

I put it as first priority just in case

Declan Freeman-Gleason
Declan Freeman-Gleason7:39 PM

Also a CAN self check

Declan Freeman-Gleason
Declan Freeman-Gleason7:39 PM

Is probably a good idea

2017-03-06
Binnur Alkazily
Binnur Alkazily9:23 PM

@Declan Freeman-Gleason good list, thank you!

Binnur Alkazily
Binnur Alkazily9:32 PM

@Lia Johansen please add following to programmer’s todo list
- ‘testing for launcher’ w/ autonomous and w/ teleop
- discuss and gather requirements from drivers (alex+will) on how best to provide driving controls for gear placement — also review how/if they are using the camera for gear placement —> I noted several teams w/ light ring around their camera @Dana Batali thoughts?
- discuss w/ drivers on the speed controls for intake, possibly a button to shift down the robot speed —> noted that as our robot moves fast w/ intake, it also has a tendency to push balls away. can we improve the efficiency of intake by driving slower?

Binnur Alkazily
Binnur Alkazily9:39 PM

@Lia Johansen / @Declan Freeman-Gleason — autonomous program for the gear will require some work, including possibly using the backup robot for testing (given time available w/ the actual bot). please make sure to review w/ leadership team for the priorities and plans on how to schedule this work

Clio Batali
Clio Batali9:40 PM

A couple of quick comments on that list: the back camera wasn't used for the majority of competition because it was flaky/not working (I discussed with Declan, and he seems to have a fix in mind). As for intake, once it was physically fixed after having the bottom bar snap in half, our intake was the smoothest it's ever been - it looked like we were just eating up balls from the ground! Though we may want to fine-tune speeds, this indicates to me that the majority of tweaks with the intake are mechanical

Clio Batali
Clio Batali9:41 PM

Also, the agitator problems are 100% mechanical/motor based, not software

Lia Johansen
Lia Johansen9:42 PM

@Binnur Alkazily i will bring up these points in the upcoming leadership meeting. Thank you.

Binnur Alkazily
Binnur Alkazily9:45 PM

@Lia Johansen @Riyadth Al-Kazily indicated that the FTA recommended using Firefox for roborio dashboard — lets remove the visible links to IE from our driver station and replace w/ Firefox for default browser. Riyadth indicated the Firefox interface was definitely more responsive

Lia Johansen
Lia Johansen9:46 PM

Yeah, it definitely is, will do @Binnur Alkazily

Binnur Alkazily
Binnur Alkazily9:47 PM

happy to de-prioritize/remove IE from daily use :slightly_smiling_face:

Chris Rininger
Chris Rininger9:52 PM

There is also the Edge browser :slightly_smiling_face: - seems like Firefox is best for this application

Dana Batali
Dana Batali10:21 PM

hold it, we are supposed to be exclusively using Firefox. If anyone is using anything else, it has definitely not been approved by the "camera team".

Dana Batali
Dana Batali10:22 PM

@Riyadth Al-Kazily, @Binnur Alkazily @Clio Batali @Declan Freeman-Gleason : were there any signs that we were using anything other than firefox during the competition?

Binnur Alkazily
Binnur Alkazily10:23 PM

@Dana Batali this is specifically for the roborio — as the IE was still in the toolbar, that is what I started when we were troubleshooting the CAN bus — this is why I am suggesting removing IE from any visible toolbars and replacing w/ firefox as default

Dana Batali
Dana Batali10:24 PM

@Binnur Alkazily: ah, hadn't gotten that, thanks for the clarification. I guess we would need to install silverlight plugin for firefox

Binnur Alkazily
Binnur Alkazily10:26 PM

they used firefox w/ roborio at the competition — so, sounds to me like it works off the shelf

2017-03-07
Paul Vibrans
Paul Vibrans5:32 AM

@Paul Vibrans has joined the channel

Paul Vibrans
Paul Vibrans5:36 AM

I just watched 1574 in the Haifa match. Their autonomous goes straight to the first hopper to get more balls then back to the boiler to shoot 30+ balls before auto ends. Their shooter rate is faster than ours.

Paul Vibrans
Paul Vibrans6:21 AM

I got that wrong, they shoot from the hopper location so must have a two position shooter.

Binnur Alkazily
Binnur Alkazily7:27 AM

If I recall, we are at 2 balls a sec in our shooter and our current autonomous takes about 13.5 sec. To speed anything up will require time w/ the robot to tune it correctly (including setting up a realistic field). Given that, I recommend optimizing around points and getting closer to the gear delivery during autonomous as a better strategy than trying to optimize for more shooting points. @Lia Johansen can follow up w/ leadership team on priorities and focus.
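Binnur's time budget works out to only a handful of extra shots. A quick back-of-the-envelope sketch (the rate and routine length are the rough figures from this thread; everything here is illustrative, not measured):

```python
# Back-of-the-envelope autonomous time budget, using the ~2 balls/sec
# shooter rate and ~13.5 sec routine mentioned above. Illustrative only.
AUTO_PERIOD_S = 15.0
SHOT_RATE_BALLS_PER_S = 2.0
CURRENT_ROUTINE_S = 13.5

def balls_shootable(time_available_s, rate=SHOT_RATE_BALLS_PER_S):
    """How many whole balls fit in the available shooting time."""
    return int(time_available_s * rate)

spare = AUTO_PERIOD_S - CURRENT_ROUTINE_S
print(balls_shootable(spare))  # -> 3: only ~3 extra balls fit in the slack
```

At these numbers, trimming the routine buys very little extra shooting, which is consistent with prioritizing gear delivery over more fuel.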

Lia Johansen
Lia Johansen7:37 AM

I was talking with Brian H, and he said he can make the shooter speed up more quickly and save us around 15 seconds for the match

Lia Johansen
Lia Johansen7:38 AM

I'll bring that up at the meeting

Paul Vibrans
Paul Vibrans7:39 AM

I took a closer look at 1574's match record and found they are a good gear bot as well. They can only load balls from a hopper so after two attempts at shooting they switch to gears. It took them halfway through quals to get dialed in but then in the finals they could get a whole hopper load in during auto. For our strategy a gear option and a fuel option would be good. The human players on the airship are limiting in a gear only strategy.

Dana Batali
Dana Batali8:15 AM

lia_johansen: we discussed this in the car ride home and we agreed that it’s not a “shoo-in”. That is: brian would need significant time with the robot and can’t guarantee that increasing the speed won’t also decrease accuracy. If the priority is high enough and expectations are tempered it would be something worth exploring. I currently suspect that getting back to where we were in Auburn is the highest priority and this might be followed by addressing the giant number of dropped gears that we experienced. These activities represent scheduling difficulties with respect to launcher speedups in that robot access is a precious resource in the next 2 weeks.

Lia Johansen
Lia Johansen8:18 AM

That makes sense. I will still ask what their highest priority is and what we as programmers can do to help

Dana Batali
Dana Batali8:19 AM

And I agree with @Binnur Alkazily that an auto gear-drop would be worth spending programming time on.

Dana Batali
Dana Batali8:19 AM

(indeed we definitely want programming to assist/support the LT to deliver on their priorities!)

Lia Johansen
Lia Johansen8:20 AM

I will give an update after our meeting wednesday. I put binnur's points into my "presentation" for tomorrow.

Dana Batali
Dana Batali8:27 AM

binnur: i understood from clio that the back camera was flaky and didn’t offer much value during the entire match. I wasn’t aware of this and we could probably have fixed it, but it all turned out well anyway :slightly_smiling_face:.

Regarding light rings, i believe a couple teams (squirrels, …) had a vision assist for gear delivery. This is easier to implement with a mecanum wheel-base, so I would think that it might be beyond our means. Toward the end of the match, Alex was getting much better at gear-delivery, so my guess is that getting the camera working reliably is the priority there. This also overlaps with the gear-pickup problem since visibility on the opposing side was very low. Physical enhancement to the gear holding system (perhaps broadening the holder into a Y shape) would also be nice.

Binnur Alkazily
Binnur Alkazily10:39 AM

Agreed that the human players are the critical path for gear delivery during auto. My goal would be to reduce the time it takes for delivery right after auto (basically do the next action faster) which would gain us about 2-3 secs in heading to the hopper to start our ball cycle

Jack Stratton
Jack Stratton10:39 AM

go for a side peg, then back out into a hopper?

Binnur Alkazily
Binnur Alkazily10:41 AM

If u trust we can do a successful delivery, yes. But I am thinking just sit at the gear delivery station and complete that action right after we start teleop vs what we do now (drive to gear, position and drop and then move to hopper).

Chris Rininger
Chris Rininger11:27 AM

One thing to consider - more strategy I suppose but relevant: the difference between the number of gears for 3 rotors vs. 4 rotors is so large that 3 rotors is (or will become) pretty easy for 2 gearbots to achieve while 4 rotors is very rare even with 3 gearbots (given traffic). This means that more often than not our alliance will not even need our pre-loaded gear to get the three rotors, and it may make sense to not drop off that gear at the beginning & instead focus on fuel immediately (increasing opp for 3 rotors + 3 climbs + the ranking point). And then if it seems like that pre-loaded gear is critical to achieving 3 rotors, THEN drop it off (& possibly sacrifice opp for the 40 kpa / ranking point).

Chris Rininger
Chris Rininger11:28 AM

later in match I mean

Riyadth Al-Kazily
Riyadth Al-Kazily4:00 PM

I like that strategy. I do agree that the 6-gear requirement for the last rotor is very difficult for most teams, and I believe that we had great success this past weekend because our shooter did give us a point advantage over the other alliance when both sides only had three rotors running. I believe that it is also a valid strategy to get three rotors running (only), and don't collect any more gears (all bots on the alliance). Instead, use the fuel points as the tie-breaker, and spend any remaining time in the match in a defensive mode, to prevent the other alliance from getting 4 rotors running.

Riyadth Al-Kazily
Riyadth Al-Kazily4:00 PM

I think these comments should be re-posted to the strategy channel. How do we do that?

Paul Vibrans
Paul Vibrans4:07 PM

After watching a bunch of matches in other venues I would say there is a common thread of intense defense after the 3rd rotor is turning if the opponents only have two going. There are a number of standard defensive patterns that seem to be evolving.

Dana Batali
Dana Batali4:56 PM

riyadth: i just shared three threads into strategy… To do this you use the “share” button which looks like an upward-sweeping arrow

Riyadth Al-Kazily
Riyadth Al-Kazily5:00 PM

Thanks for the tip! And for sharing the messages.

2017-03-08
Dana Batali
Dana Batali8:49 AM

nvidia announced their newest jetson, the tx2. since they are a sponsor, i thought it was worth sharing and having a few students read about it, including marketing, @Jon Coonan

https://developer.nvidia.com/embedded/buy/jetson-tx2-devkit

Clearly this will have no impact on this year's robot.

2017-03-09
Chris Rininger
Chris Rininger12:09 PM

I saw a CD thread a while back that revealed there are numerous teams that include a separate monitor for video from the robot on their drive stations, and it made me wonder how usable our video feed is on the laptop monitor for Alex driving. Sight lines are often pretty bad I would guess given the size of the airships, meaning the robot video feed must be used. Would a small separate monitor on the drive stations for video feed from robot help? It may not be terribly difficult to add (says I, who knows very little about how the drive station is set up).

Declan Freeman-Gleason
Declan Freeman-Gleason2:08 PM

It shouldn't be difficult to add, someone should ask Alex what he thinks.

Alex Larson Freeman
Alex Larson Freeman3:31 PM

Honestly the only thing I need to look at on the screen is the camera view, and that's pretty much what's on there during the match, plus there isn't really room for another monitor. Cool idea though

Dana Batali
Dana Batali3:33 PM

i suspect we can make the image a little larger (10%-ish) if that seems worthwhile. Of course higher priority is just to get the back-view camera working reliably, which I understood that it didn't in Auburn.

Declan Freeman-Gleason
Declan Freeman-Gleason3:37 PM

@Alex Larson Freeman When the login prompt for the camera showed up during the competition did you just press cancel?

Declan Freeman-Gleason
Declan Freeman-Gleason3:39 PM

@Dana Batali Just fyi: The login prompt was showing up because my changes to fix that never got pulled. Should we make the batch file do that when it gets run and just guarantee that upstream is always working? Or maybe just an out-of-date message on the dashboard?

Dana Batali
Dana Batali3:42 PM

@Declan Freeman-Gleason : i hadn't understood that it was as simple as the login prompt. If that was the failure condition, then we hope that your firefox-only-fix will resolve the problem. We won't know without some real validation, but it does seem quite likely... But if it wasn't the failure condition, then we have more diagnostic activities ahead. I think it was @Clio Batali who was tasked with setting up the camera feeds, so let's make sure to get her feedback on this.

Alex Larson Freeman
Alex Larson Freeman3:43 PM

Honestly clio was the one dealing with the cameras since I was helping will set up the robot

Declan Freeman-Gleason
Declan Freeman-Gleason3:58 PM

@Dana Batali It may not be that simple, but it's certainly what I think we should try first.

Declan Freeman-Gleason
Declan Freeman-Gleason3:59 PM

@Clio Batali When the login prompt for the camera showed up during the competition did you just press cancel?

Dana Batali
Dana Batali3:59 PM

Without more info from clio, that seems like the clear choice.

Declan Freeman-Gleason
Declan Freeman-Gleason3:59 PM

I agree

Dana Batali
Dana Batali4:00 PM

@Declan Freeman-Gleason: do you know that the login prompt was presented?

Declan Freeman-Gleason
Declan Freeman-Gleason4:04 PM

The login prompt shouldn't be hidden, login credentials are either cached, or the browser shows the prompt.

Declan Freeman-Gleason
Declan Freeman-Gleason4:05 PM

However, this process seems to be poorly documented and not very transparent so I'm just going off of observations here.

Dana Batali
Dana Batali4:07 PM

We do know that Clio was aware of this requirement, though perhaps not as experienced with the firefox variant... That's why I'm inclined to suspect a different problem. I'll stop speculating at this point... :wink:

Clio Batali
Clio Batali7:50 PM

I was required to log in periodically for the first day as expected, with a dialogue box and all on the 10.49.15.13 page. Later, about halfway through Saturday, when attempting to connect to that page nothing relating to that camera would load (though it looked like it was trying). Once some sort of prompt did show up on the web dashboard that seemed to authenticate the connection to the d-link from there, and I believe it worked for that match, but that was shortly before the camera stopped working.

Declan Freeman-Gleason
Declan Freeman-Gleason9:19 PM

Alright, I suppose we'll just need to test and stop speculating then.

2017-03-14
Paul Vibrans
Paul Vibrans7:25 AM

I watched an Australian bot at the Southern Cross tournament get 40+ fuel points in six out of seven qualifying matches with a front of the boiler shooter like ours. Their auto routine started with the bot angled parallel with the key line and touching the wall with one corner. When it starts it follows the key line to the hopper release and squares up to the wall as balls pile in. Then it backs up, turns parallel to the wall and moves to the boiler where it turns 45 degrees and drives to the boiler front. Its shooter has more dispersion than ours but it was getting over ten auto points every time. Can our robot be programmed to do this for the regionals?

Declan Freeman-Gleason
Declan Freeman-Gleason8:22 AM

@Paul Vibrans I had this very thing in mind and I took measurements on the field at Auburn for it, but our autonomous probably needs to be faster. Do you have the team number or a video?

Paul Vibrans
Paul Vibrans8:41 AM

The team is 4613. They should be on streaming video from the Southern Cross Regional starting 3:00 PM our time.

Paul Vibrans
Paul Vibrans9:00 AM

I looked at the match results for 4613 at the Shenzhen Regional, which they won, and the Southern Cross Regional, where they are currently number one, and see that their teleop scores are generally lower than ours and their autonomous scores are two to three times ours because of more balls to shoot. The difference in teleop may be a function of a more gear centric strategy after auto. I still think our shooter is more accurate because of less dispersion of the shots.

Lia Johansen
Lia Johansen5:39 PM

Here is the list of positions for open house tomorrow. If your name is not here, you may help with greetings and directions.
Show second Chassis: Lia, Timo
Launcher: Brian, Ronan, Jeremy
Autonomous: Declan

Enrique Chee
Enrique Chee8:29 PM

Will Hobbs
Will Hobbs8:38 PM

Here is a youtube link for the autonomous shown above https://www.youtube.com/watch?v=k2pWR593tqI

2017-03-15
Dana Batali
Dana Batali2:08 PM

the new beaglebone seems nearly equivalent to the roborio (more powerful in some ways - built-in IMU)... https://www.arrow.com/en/products/bbblue/beagleboardorg

2017-03-16
Jack Stratton
Jack Stratton7:36 PM

binnur: (#strategy) every button on both the xbox and the joystick is in use; we'd need either the third joystick or to replace the (currently duplicated but that might be for a reason) intake buttons on the joystick

Jack Stratton
Jack Stratton7:36 PM

or maybe the keyboard?

Binnur Alkazily
Binnur Alkazily7:37 PM

keyboard…! like that idea, easy to map?

Jack Stratton
Jack Stratton7:37 PM

it seems quite hard to map

Binnur Alkazily
Binnur Alkazily7:38 PM

K - how about button on the dashboard to start the command?

Jack Stratton
Jack Stratton7:41 PM

we could try that, I'd have to look and see how command buttons are implemented in the normal dashboard

Jack Stratton
Jack Stratton7:42 PM

I don't think there's a similar feature natively in pynetworktables

Jack Stratton
Jack Stratton7:43 PM

(I think the easiest would be to temporarily plug in the spare xbox controller while we're recording and just remove it when we're done)

Binnur Alkazily
Binnur Alkazily8:07 PM

I like keeping it easy and simple

Binnur Alkazily
Binnur Alkazily8:08 PM

adding a smartdashboard button that calls the command should be simple — but maybe confusing with the two dashboards…

Jack Stratton
Jack Stratton8:21 PM

I'll look in to it

2017-03-17
Dana Batali
Dana Batali10:01 AM

i expect that pynetworktables is sufficient to the task of adding a button. Another approach would be to add a new mode selector (perhaps on the dev page), that triggers button reassignments.

Dana Batali
Dana Batali10:17 AM

A quick perusal suggests that a sendable command creates a named subtable that follows a specific convention. Here's the source for putData:

public static void putData(String key, Sendable data) {
    ITable dataTable = table.getSubTable(key);
    dataTable.putString("~TYPE~", data.getSmartDashboardType());
    data.initTable(dataTable);
    tablesToData.put(data, key);
}

where the key is the name we select for the command

and here is the implementation of Command::initTable

public void initTable(ITable table) {
    if (m_table != null) {
        m_table.removeTableListener(m_listener);
    }
    m_table = table;
    if (table != null) {
        table.putString("name", getName());
        table.putBoolean("running", isRunning());
        table.putBoolean("isParented", m_parent != null);
        table.addTableListener("running", m_listener, false);
    }
}

Now, from a pynetworktables point of view, the only issue is that subtables aren't explicitly supported; we just need a path-name to the individual subtable elements (like /SmartDashboard/OurCommand/running), etc.
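The subtable convention above can be sketched in plain Python. A dict stands in for a real NetworkTables connection (with pynetworktables you would read and write the same flattened paths); "OurCommand" is a hypothetical command name:

```python
# Sketch of the SmartDashboard sendable-command key convention described
# above, using a plain dict in place of a real NetworkTables connection.
# "OurCommand" is a hypothetical command name, not one from our code.

def command_keys(command_name, base="/SmartDashboard"):
    """Return the flattened subtable paths a sendable command occupies."""
    prefix = "%s/%s" % (base, command_name)
    return {
        "type": prefix + "/~TYPE~",
        "name": prefix + "/name",
        "running": prefix + "/running",
    }

table = {}  # stands in for NetworkTables
keys = command_keys("OurCommand")
# flipping "running" is what a dashboard button would do to start the command
table[keys["running"]] = True

print(keys["running"])  # -> /SmartDashboard/OurCommand/running
```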

Dana Batali
Dana Batali11:06 AM

The other related topic is Buttons/Triggers (which we instantiate in OI.java).
We could theoretically simulate a button-press in our webapp... Buttons are also represented in network tables and are "polled" by the scheduler. The important table sub-field is called "pressed".
There is a class called "NetworkButton" that spells it out.
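A minimal sketch of the NetworkButton idea, again with a dict standing in for the real table; the class and key names here are invented for illustration, not WPILib's actual implementation:

```python
# Toy model of the NetworkButton mechanism described above: the scheduler
# polls a boolean "pressed" field in NetworkTables each loop. A dict stands
# in for the table; "RecordButton" is a hypothetical button name.

class FakeNetworkButton:
    def __init__(self, table, name):
        self.table = table
        self.key = "/SmartDashboard/%s/pressed" % name

    def press(self):
        # what a dashboard widget (or our webapp) would write
        self.table[self.key] = True

    def get(self):
        # what the scheduler would poll each iteration
        return self.table.get(self.key, False)

nt = {}
btn = FakeNetworkButton(nt, "RecordButton")
btn.press()
print(btn.get())  # -> True
```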

2017-03-18
Paul Vibrans
Paul Vibrans5:11 PM

I just watched match videos of Skunkworks and saw that they must back away from the wall at the loading chute to make the gears fall properly. The back away seems uniform from try to try as if it is a programmed move like autonomous. Is something like this possible or helpful for us?

Lia Johansen
Lia Johansen7:30 PM

Hey programmers, autonomous is going to work on the driving part of the improved auto. @Declan Freeman-Gleason will be fixing the camera and then move to auto. I do not think launcher (or other) people need to come tomorrow (as mechanics will have the robot for 3+ hrs)

Riyadth Al-Kazily
Riyadth Al-Kazily7:34 PM

@Lia Johansen Will we be able to test launcher sensors without launcher team members present? I assume there will be some adjustments done to the encoder mounts, and it will be important to check that everything is working correctly.

Binnur Alkazily
Binnur Alkazily9:06 PM

@Lia Johansen if the intention is to shoot more balls than just 10, we need to speed up the launcher as well — I think we are about 2 balls/sec — launcher team, please verify

Brian Hutchison
Brian Hutchison9:09 PM

We're at around 3/sec

Declan Freeman-Gleason
Declan Freeman-Gleason9:16 PM

@Binnur Alkazily I think that faster shooting is important, but we won't have any more balls to shoot unless we get faster driving. Do you agree? Do you think a launcher person should be there tomorrow?

Riyadth Al-Kazily
Riyadth Al-Kazily9:20 PM

I guess we should measure the rate of the shooter to know for sure. And driving faster is the first step to shooting more. However, if we don't get the shooting rate faster, then we probably don't need to drive faster either. I think if the goal is to increase the number of shots fired in autonomous, then we need both a shooter person and a drivetrain person.

Lia Johansen
Lia Johansen9:32 PM

For auto, the launcher will only shoot 10 balls @Brian Hutchison ? So that needs to be fixed

Brian Hutchison
Brian Hutchison9:34 PM

I'll be there tomorrow at 1

Brian Hutchison
Brian Hutchison9:34 PM

I'll fix the ten ball limit before then

Jeremy Lipschutz
Jeremy Lipschutz9:53 PM

i can come at 1 as well, i have tennis at bhs at 2 so that'll be fine

Lia Johansen
Lia Johansen10:19 PM

@Brian Hutchison @Jeremy Lipschutz awesome. Thanks! Just stay as long as u are needed/want to

Ronan Bennett
Ronan Bennett10:20 PM

I can come at one too if needed

Riyadth Al-Kazily
Riyadth Al-Kazily10:56 PM

FYI, I used a stopwatch while watching us shoot on the Auburn Mountainview videos, and I'm pretty sure we're at around 2 balls per second (or slower) currently.

Riyadth Al-Kazily
Riyadth Al-Kazily10:57 PM

Maybe someone can check my math. Here is a good one to measure:
https://youtu.be/-RhM2tzKhg?t=7s

Enrique Chee
Enrique Chee11:22 PM

Thanks Brian, jeremy_lipschutz, and Ronan for showing up Sun at 1 pm.

2017-03-19
Paul Vibrans
Paul Vibrans6:39 AM

I timed the video at 2.4 balls per second for balls two through nine. There is a startup delay and an ending delay because the agitator has an empty pocket.
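The stopwatch measurement reduces to simple interval arithmetic. The timestamps below are made up purely to illustrate the calculation, not taken from the video:

```python
# Interval arithmetic behind the stopwatch measurement above: timing balls
# two through nine covers 8 shots. The elapsed time here is hypothetical,
# chosen only to show the calculation.

def shot_rate(first_ball, last_ball, elapsed_s):
    """Balls per second over an interval, counting both endpoint shots."""
    shots = last_ball - first_ball + 1
    return shots / elapsed_s

rate = shot_rate(2, 9, 3.3)   # 8 shots in a hypothetical 3.3 s
print(round(rate, 1))         # -> 2.4
```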

Binnur Alkazily
Binnur Alkazily11:42 AM

@Lia Johansen @Declan Freeman-Gleason the main question is from the aspect of ‘should we stay at the boiler and shoot all balls during 15sec autonomous, or end it at some point to get 5pts to cross the barrier’

Jeremy Lipschutz
Jeremy Lipschutz11:45 AM

@Binnur Alkazily do we not cross the baseline when we open the hoppers?

Lia Johansen
Lia Johansen11:47 AM

I think so @Jeremy Lipschutz

Lia Johansen
Lia Johansen11:57 AM

@Binnur Alkazily i am not sure. I thought we would cross when we open the hoppers? Im not sure though

Binnur Alkazily
Binnur Alkazily12:01 PM

@Lia Johansen and @Jeremy Lipschutz this strategy is specifically relating to going to the hoppers first, filling up our storage, shoot, and then cross (or not)

Jeremy Lipschutz
Jeremy Lipschutz12:01 PM

@Binnur Alkazily

Jeremy Lipschutz
Jeremy Lipschutz12:01 PM

@Binnur Alkazily I don't think we need to cross after we shoot because we cross when we go to the hoppers

Riyadth Al-Kazily
Riyadth Al-Kazily12:01 PM

Is "crossing the baseline" meaning "on the other side of the baseline at the end of autonomous", or is it "robot crossed the baseline at some time during autonomous"?

Lia Johansen
Lia Johansen12:02 PM

@Jeremy Lipschutz i thought so

Riyadth Al-Kazily
Riyadth Al-Kazily12:02 PM

We should get a rules clarification on that.

Clio Batali
Clio Batali12:03 PM

Just at some point - the robot only has to break the vertical plane of the line during autonomous (give me a second to find the exact rule)

Lia Johansen
Lia Johansen12:03 PM

I wont be at the meeting today btw

Jeremy Lipschutz
Jeremy Lipschutz12:04 PM

For each ROBOT that breaks the BASE LINE
vertical plane with their BUMPER by T=0

Jeremy Lipschutz
Jeremy Lipschutz12:04 PM

that's what i copied from the matchplay.pdf

Binnur Alkazily
Binnur Alkazily12:04 PM

sweet!!

Binnur Alkazily
Binnur Alkazily12:05 PM

then we can just sit at the boiler (unless we want to move towards the gear delivery before auto ends)

Clio Batali
Clio Batali12:05 PM

Section 4-3 - thanks Jeremy

Riyadth Al-Kazily
Riyadth Al-Kazily12:12 PM

If we grab a lot of balls (fill our "cargo hold" - what do we call it?) and come back and start shooting, the field management system (FMS) will probably end the launch command at the end of autonomous. Since we may have more balls to shoot, it could be advantageous to keep the shooter running somehow, while the drivers get to the controls. Otherwise the shooter shuts down and has to be restarted, losing a second or so of shooting.
This is "optimization" territory, and this suggestion may be not worth implementing. And may not even be legal... Might look cool to have a robot keep going at the end of autonomous. :-)

Riyadth Al-Kazily
Riyadth Al-Kazily12:21 PM

To pick a time for the "fuel dump" from the hopper, this video is clear and would provide a good approximation:
https://youtu.be/-RhM2tzKhg?t=54s

Riyadth Al-Kazily
Riyadth Al-Kazily12:22 PM

Looks to me like we'd want to wait 4 seconds after pushing the release bar in order to get all the balls.

Riyadth Al-Kazily
Riyadth Al-Kazily12:24 PM

Of course we can leave earlier, as long as we have enough balls to meet our goal (whatever we set that to be).

Brian Hilst
Brian Hilst12:40 PM

Keep in mind that there is a delay in counting balls in the boiler, so we don't know how many we can get counted before it's over.

Binnur Alkazily
Binnur Alkazily1:02 PM

slow start to the day... Will be in before 2pm (but after 1pm) FYI

Riyadth Al-Kazily
Riyadth Al-Kazily5:39 PM

FYI, firmware versions between the two robots differ:
Main robot (bagged) versions:
Firmware: 2.1.0f3
Image: FRC_roboRIO_2017_v8

Backup robot (bare chassis) versions:
Firmware: 2.0.0b86
Image: FRC_roboRIO_2017_v8

Riyadth Al-Kazily
Riyadth Al-Kazily5:40 PM

Recommendation: update the backup robot to the latest version. If the latest version is newer than what is on the bagged robot, we should probably upgrade that one too.

Riyadth Al-Kazily
Riyadth Al-Kazily5:41 PM

(Odd behavior with the network camera on the main robot was fixed by unplugging the RoboRIO from the radio and power cycling the radio. The camera remained "fixed" after re-plugging in the RoboRIO.)

Riyadth Al-Kazily
Riyadth Al-Kazily5:42 PM

No idea if this has anything to do with the firmware, but the problem seems to be related to the main RoboRIO...

Enrique Chee
Enrique Chee6:21 PM

Thanks for update on the camera problem.

Riyadth Al-Kazily
Riyadth Al-Kazily6:33 PM

On Chief Delphi, teams report that the current firmware version (as of February) should be 3.0.0f0

Riyadth Al-Kazily
Riyadth Al-Kazily6:33 PM

Both our RoboRIOs are then out of date.

Riyadth Al-Kazily
Riyadth Al-Kazily6:33 PM

We should update the practice bot and see if there are any major hiccups, and if not, we should update the main bot too.

Paul Vibrans
Paul Vibrans7:42 PM

The video QM19 referenced by Riyadth is interesting because it shows we could have got a lot more points by shifting the shots more to the right. I suspect the drivers could not tell the difference between balls going behind the smoke stack and balls going in. If this is really the problem, is there a way to correct it?

2017-03-20
Dana Batali
Dana Batali10:51 AM

@Timo Lahtinen et al: I found that I could add an existing project to my eclipse workspace following these steps:

1. right-click in the package explorer, select import
2. select an import wizard: General/Existing Projects into Workspace
3. select root directory: (I selected ../workspace/2017-STEAMworks.clean
4. Finish

2017-03-22
Ethan Rininger
Ethan Rininger10:11 PM

@Ethan Rininger has joined the channel

2017-03-25
Declan Freeman-Gleason
Declan Freeman-Gleason8:20 AM

@Lia Johansen 8:30 is calibration and measurement

2017-03-26
Declan Freeman-Gleason
Declan Freeman-Gleason4:37 PM

My priority list in order of importance.

Binnur Alkazily
Binnur Alkazily6:12 PM

Launcher team, once we figure out the launching issues (theory is that the gunk build-up on the launcher wheel caused the problems), let's crank the agitator speed back up - goal is to see if we can do 10-ball + gear delivery in auto -- speed + accuracy will matter for success

Binnur Alkazily
Binnur Alkazily6:14 PM

@Lia Johansen please make sure to tag any changes from GP competition

Lia Johansen
Lia Johansen6:15 PM

Okay, so I'll change the code rpm from 1880 to 1800

Lia Johansen
Lia Johansen6:15 PM

And then make a tag

Lia Johansen
Lia Johansen6:15 PM

I made a tag last night

Lia Johansen
Lia Johansen6:15 PM

@Binnur Alkazily

Binnur Alkazily
Binnur Alkazily6:16 PM

@Lia Johansen keep everything from this weekend as is and tag -- when shooter is functioning please then update the speeds and validate we still have reliable shooting w/ higher speed.

Binnur Alkazily
Binnur Alkazily6:16 PM

Works?

Binnur Alkazily
Binnur Alkazily6:17 PM

I think we are saying the same thing :)

Lia Johansen
Lia Johansen6:17 PM

I already have a tag i made last night. U want the new rpm 1800? Even if we changed it because of the launcher issue?

Binnur Alkazily
Binnur Alkazily6:25 PM

When they fix the launcher issue, let's see if we can go back to our prior numbers

Paul Vibrans
Paul Vibrans8:12 PM

At Auburn Mountain View we started the tournament with a new shooter wheel. At Glacier Peak we started the tournament with a used shooter wheel that had been cleaned, maybe not enough. Do we have a new wheel that we can put on in Cheney or should we spend the time trying to really clean the old one?

Riyadth Al-Kazily
Riyadth Al-Kazily9:23 PM

I believe that the "overshoot" condition was the result of a second ball being launched before the flywheel had returned to its set speed (after the first ball is launched). It may be that the system is oscillating now, as it returns to the set speed, and our shot-to-shot time coincides with the time where the flywheel is going too fast. Assuming our system was properly damped before (returning to set speed swiftly without overshoot), then something physical likely changed that affected the balance. If this is the case, then we should be able to re-tune the launcher PID to the new physical state of the machine. However, if we can't determine what changed, then it may change more at Cheney, and result in PID errors again.

Riyadth Al-Kazily
Riyadth Al-Kazily9:27 PM

It is possible that the stuff on the shooter wheel is responsible, but I would have expected a more gradual change in performance as the deposits built up. But I think the overshoot of the PID system is likely due to added resistance on the motor/flywheel (friction?), resulting in either a more severe slowing of the wheel during shots, or the need to apply more voltage to compensate for the slowing (to overcome resistance). Maybe lubrication would help?

Enrique Chee
Enrique Chee9:37 PM

We do have new wheels

Riyadth Al-Kazily
Riyadth Al-Kazily9:41 PM

For autonomous work (ie, auto gear, shoot+gear, hopper+shoot) I'd like to suggest that the team do some work with the backup chassis (ideally with some weight added). We should be able to test repeatability and accuracy with the subsystems and commands we have, and work on improving both. Final tuning of the commands would probably have to wait for Cheney, but if we start tweaking things there we will probably never get it finished in time.

Riyadth Al-Kazily
Riyadth Al-Kazily9:41 PM

Is it possible to meet sometime during the week or on the weekend?

Riyadth Al-Kazily
Riyadth Al-Kazily9:41 PM

Would anyone be interested in working on the problem?

2017-03-27
Lia Johansen
Lia Johansen12:59 PM

@Riyadth Al-Kazily : we are not doing the hopper shoot

Lia Johansen
Lia Johansen1:01 PM

Also we are meeting Wednesday to work on side gear auto

Riyadth Al-Kazily
Riyadth Al-Kazily1:37 PM

Are you going for side gear AND shoot? Or just side gear?

Lia Johansen
Lia Johansen3:37 PM

Just side gear

Lia Johansen
Lia Johansen5:10 PM

@Declan Freeman-Gleason , @Brian Hilst , @Niklas Pruen : There is going to be a meeting this Wednesday from 3 - 6 and programmers will be working with the second chassis working on the side gear autonomous. Launcher people do not need to come as we will not be able to work with the real robot.

2017-03-28
Binnur Alkazily
Binnur Alkazily9:01 AM

@Lia Johansen on wed, I will try to stop by at the end of the day - FYI

Brian Hilst
Brian Hilst4:42 PM

I can be there

2017-03-29
Declan Freeman-Gleason
Declan Freeman-Gleason8:19 AM

I had an idea that could prove interesting, especially if we feel we need to place more gears: since we have to retune the launcher anyway, what if we reangled and retuned it to shoot from the side gear peg? We have to sit there anyway while a gear is pulled up, and we want to go to the side gear peg in autonomous anyway. This could reduce cycle times, although we don't know if the launcher can shoot that far.

Chris Rininger
Chris Rininger8:37 AM

if reangle/retune launcher is an option, then shooting from hopper during auto is another direction to consider - seems like many robots in the 40 kPa club do this & score 20+ fuel in auto

Binnur Alkazily
Binnur Alkazily8:41 AM

@Lia Johansen change of plans - unfortunately I won't be able to make it this evening after all / pls send out a quick update after the meet up. Thanks!

Dana Batali
Dana Batali8:47 AM

i'll be there today

Binnur Alkazily
Binnur Alkazily9:04 AM

Awesome @Dana Batali

Lia Johansen
Lia Johansen9:23 AM

@Declan Freeman-Gleason : interesting idea, but we would have to tune the launcher at cheney. My worry with that is it might not work and could mess everything up.

Chris Rininger
Chris Rininger9:36 AM

FYI - other teams are making some bets on changes like this in prep for district champs and worlds: https://www.chiefdelphi.com/forums/showthread.php?t=157194

Riyadth Al-Kazily
Riyadth Al-Kazily11:12 AM

There is talk over in the #climber channel about changing the gear ratios on the climber motor (to get a faster climb). Right now the software uses percent vbus for the climber motor (which is the correct control mode to use), and it sets 90% for "fast" climb, and 45% (half of the fast) for the "slow" climb. I think we should re-consider using a ratio to select the slow speed, and instead should pick an actual percentage value. With a faster gear ratio, we probably need to reduce the slow speed below half anyway, because if we turn too fast we may not catch the rope.

Riyadth Al-Kazily
Riyadth Al-Kazily11:14 AM

Also, with the desire for more speed, I suggest boosting the percentage for fast climb to 100. If we increase the torque on the motor through the change in gear ratio, we are more likely to stall the motor, so giving it 100% will be more reliable than any fraction of that.
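
The change Riyadth proposes — independent fast/slow percentages instead of deriving slow as half of fast — might look like this. Both constant values are placeholders, not the robot's real settings:

```java
// Sketch: pick slow/fast climb as independent constants rather than
// computing slow as a fixed ratio of fast. Values are illustrative only.
public class ClimberSpeeds {
    public static final double FAST_PERCENT = 1.00; // full output, harder to stall
    public static final double SLOW_PERCENT = 0.35; // tuned separately for rope pickup

    // Percent-vbus output for the selected climb mode.
    public static double output(boolean fast) {
        return fast ? FAST_PERCENT : SLOW_PERCENT;
    }
}
```

Decoupling the two constants lets the slow speed be retuned after a gear-ratio change without touching the fast speed.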

Riyadth Al-Kazily
Riyadth Al-Kazily11:15 AM

And do we report the climber motor current to the drivers on the driver station? It could be good for them to see if current is getting too high, which could mean that the motor has stalled. This really is only useful if they can raise the motor set point to compensate for the stall condition.

Riyadth Al-Kazily
Riyadth Al-Kazily11:15 AM

(And I also cannot be at the meeting today...) :-(

Paul Vibrans
Paul Vibrans2:08 PM

Does the motor controller try to deliver a constant output voltage, a constant output current, or a constant duty cycle? I have always assumed it was voltage.

Riyadth Al-Kazily
Riyadth Al-Kazily4:52 PM

It is basically a constant output voltage, however it accomplishes this by using a pulse-width modulated signal that switches the output on and off at a high rate (15.625kHz, according to the manual). The pulse duty cycle allows a proportional voltage to appear on the motor terminals.
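
The averaging Riyadth describes is just bus voltage times duty cycle; a sketch (the 15.625 kHz switching rate is the figure he quotes from the manual):

```java
// Average voltage delivered by a PWM motor output: bus voltage scaled by
// the fraction of each cycle the output is switched on (the duty cycle).
public class PwmMath {
    public static final double SWITCHING_HZ = 15625.0; // per the controller manual

    public static double averageVoltage(double busVoltage, double dutyCycle) {
        // Clamp duty cycle into [0, 1] before scaling.
        return busVoltage * Math.max(0.0, Math.min(1.0, dutyCycle));
    }

    // Length of one on/off cycle in microseconds.
    public static double periodMicros() {
        return 1e6 / SWITCHING_HZ;
    }
}
```

At 15.625 kHz each on/off cycle lasts 64 microseconds — far faster than the motor windings can respond — so the motor effectively sees only the average voltage.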

Jack Stratton
Jack Stratton11:39 PM

if it turns out that you want to use auto recordings, the buttons on the dashboard will work fine until you want to shoot. to shoot, make a recording, then do these steps on a linux computer:
`ssh lvuser@roborio-4915-frc.local`
`cd Recordings`
`mv "<latest recording>" "<new name>"`
`exit`
`scp lvuser@roborio-4915-frc.local:Recordings/"<new name>" /tmp/x`
`head -n1 /tmp/x | tr ',' '\n'`, paste output in a text editor with line numbers (or add `| nl` to the command)
find a part with a bunch of zeroes that looks like when you've parked the robot
`ssh lvuser@roborio-4915-frc.local`, then in the remote shell: `cd Recordings`
`vi "<new name>"`, add the line number of the right point to the third line of the file (it's only two lines right now, so add a third with `G` then `o` then start typing)

2017-03-30
Dana Batali
Dana Batali7:33 AM

@Jack Stratton - so far I follow you, the only question I have is what to type to signify shooting?

Dana Batali
Dana Batali7:38 AM

@Jack Stratton, @Timo Lahtinen : Jack - can you point us at the ruby script for transposing the csv files for scouting? I was thinking it would be a good idea to convert it to python and check it into the spartronics repo somewhere. Perhaps we need a new (tiny) repo called 2017-Scouting?

Dana Batali
Dana Batali8:55 AM

@Declan Freeman-Gleason : could you summarize for this channel what you learned about at Glacier Peak regarding IP-camera/roborio flakiness?

Jack Stratton
Jack Stratton9:58 AM

dana_batali: the line number of the middle zero of a given cluster of zeroes in the tr output goes on the third line of the real file on the robot. here's the script, I already gave it to ethan and jon; jon has already used it successfully on the computer that will be doing the scouting at the next events. https://gist.github.com/phroa/dcf7ec5e5007bd7d715542167f1e04fc

Dana Batali
Dana Batali10:03 AM

@Jack Stratton -

- so the signal to shoot is a 3rd line, whose value is the index into the second line where shooting is to begin?

- fyi @Jon Coonan asked @Timo Lahtinen to install ruby on his laptop which led to this question... Rather than add a new language to the mix, I suggested that it would be easy to convert your ruby script to python

Jack Stratton
Jack Stratton10:04 AM

dana_batali: yes, and interesting... so much for the scouting computer in the marketing box

Dana Batali
Dana Batali10:05 AM

@Jack Stratton - can you point me to an example input csv file?

Dana Batali
Dana Batali10:07 AM

(i wasn't aware of a scouting computer or how it relates to marketing, nor even why @Jon Coonan requested programming assist, probably just a backup plan...)

Jon Coonan
Jon Coonan10:08 AM

Yeah, so the deal with needing Timo to install Ruby is that he is heading scouting at Cheney and Worlds. I can use the marketing laptop, which has Ruby on it, to compile the scouting data, and then we won't need Timo's computer. I just figured he might want to install it on his personal machine as a backup. If that creates issues for programming we can just use the pit computer.

Jack Stratton
Jack Stratton11:12 AM

oh, @danabatali, I never specified how to run it: `ruby mergespeedscout_17.rb path/to/folder/*.csv > out.csv`

2017-04-01
Riyadth Al-Kazily
Riyadth Al-Kazily9:40 AM

Happy Arduino Day, especially to the Bling team: https://day.arduino.cc/

Riyadth Al-Kazily
Riyadth Al-Kazily12:22 PM

FYI, Binnur and I cannot make it to Cheney next week. Too many things going on at work for both of us.

Riyadth Al-Kazily
Riyadth Al-Kazily12:22 PM

We will monitor Slack if there are questions about debugging. Let us know if we can help remotely.

2017-04-05
Brian Hilst
Brian Hilst8:55 PM

@Declan Freeman-Gleason @Lia Johansen What time do you expect to begin testing the new autonomous strategies? Niklas and I are planning to come over for that.

Lia Johansen
Lia Johansen8:57 PM

@Brian Hilst : we plan on starting testing at 9:30 am. We already tested one of the side gears and it was successful. You definitely do not need to come at 9:30 am. We can keep u updated

Brian Hilst
Brian Hilst8:59 PM

Ok. Thanks!

2017-04-08
Chris Rininger
Chris Rininger9:52 PM

I wasn't sure whether to share here or Random. This is something Bear Metal threw together using field & robot CADs + the Unity game engine + some programming. They showed it to me while I was browsing the pits. I guess a couple team programmers built it Sunday- Wednesday this week. Anyway, pretty cool - could be a way to create a driving practice sim that anyone on the team could play with. The Bear Metal folks were very nice and enjoyed sharing it - they might even share the source & how to integrate another robot CAD if asked (perhaps post-season).

2017-04-09
Paul Vibrans
Paul Vibrans11:37 AM

I just watched QF3-2 at McMaster University and one robot shot 10 balls in auto with one obvious miss and still got 10 kPa before teleop. What can we do about inconsistent ball counting?

Paul Vibrans
Paul Vibrans11:39 AM

I wouldn't be surprised if some fields consistently count low.

Paul Vibrans
Paul Vibrans11:41 AM

Or did I see a team that was able to sneak an 11th ball into their hopper?

Riyadth Al-Kazily
Riyadth Al-Kazily11:44 AM

I think the trick is to catch the miss as it falls off the boiler, and shoot it again :-)

Paul Vibrans
Paul Vibrans11:57 AM

The miss I saw stayed on the top of the boiler in the net. All other shots went in and no hoppers were dumped in auto. There was only one shooter.

Paul Vibrans
Paul Vibrans11:58 AM

I wonder if we could get alliance mates to shoot into our hopper as we make our turn toward the boiler.

2017-04-20
Brian Hilst
Brian Hilst10:46 AM

The Worlds matches have started. We are currently 1:1. Our remaining matches today are scheduled for 12:18pm & 1:24pm. They are roughly on schedule.

Here is a link to our schedule: https://www.thebluealliance.com/team/4915/2017

You can watch live at: https://atthecontrol.com/dashboard/home/HOPPER/4915

If they post videos for prior matches, they should be at: https://www.thebluealliance.com/team/4915/2017#videos

There are 6 matches tomorrow, starting at 6:00 AM

2017-05-21
Brian Hutchison
Brian Hutchison5:59 PM

These two videos helped me to understand PID control and I think that they would be a good way to explain PID to beginners for next year

2017-05-22
Binnur Alkazily
Binnur Alkazily1:26 PM

cool - programming leads, I suggest either starting to build a resources list on our github, and/or pin these items to this channel so they don't get lost

James Slattery
James Slattery1:29 PM

@James Slattery pinned a message to this channel.

James Slattery
James Slattery1:29 PM

@James Slattery pinned a message to this channel.

2017-06-03
Chris Rininger
Chris Rininger11:03 AM

A couple fellow Microsofties who mentor our perennial neighbors and friends, 4911, reached out to me with an opportunity to share and help extend their scouting platform. I believe they also worked with 4663 on it. Anyway, the solution involves inexpensive Android tablets (they use the cheapest Kindle Fires) in the stands + a Windows laptop as a server, with connectivity over Bluetooth. Rose expressed interest, but after receiving more information, what would be required is A) a mentor who can help with setting up the end-to-end solution and B) at least one student programmer who can participate in one of the three areas of student contribution:
1. Android App – design and UI programming for web/mobile development inclined students
2. Data movement - pushing data from app to server for students who are inclined for "systems" work
3. Tableau / Data Analysis - for statistics / machine learning inclined students

In a response to this message, I will attach a PDF of a more detailed email from one of the mentors. Please let me know if there is interest so I can get back to them. Thanks!

Ronan Bennett
Ronan Bennett11:54 AM

@Chris Rininger I'm interested in helping out, but don't have any experience in JSON, SQL, databases etc. I'd have to learn as I went along, which I'd definitely be willing to do.

Chris Rininger
Chris Rininger12:06 PM

I recall interest in cultivating Tableau skills on the team, and I suspect it would be a quicker area to ramp up on than the multi-layer Data Movement area - maybe Tableau would be an area for someone to jump in with 4911? Get the Tableau tool and do online training to start perhaps?

Chris Rininger
Chris Rininger12:10 PM

@Ronan Bennett It would be really great useful stuff to learn, but there will be a learning curve. Let's see if others are also interested & in what areas, and then we can have a dialog with Anne and Johan from 4911 to see if students need to have deep skills coming in or if they can be less experienced.

Ronan Bennett
Ronan Bennett12:11 PM

@Chris Rininger Ok, sounds good.

Declan Freeman-Gleason
Declan Freeman-Gleason12:13 PM

@Chris Rininger I would definitely be interested in contributing

Declan Freeman-Gleason
Declan Freeman-Gleason12:14 PM

For anyone else who wants more information, https://github.com/frc4911 is their GitHub.

2017-06-05
James Slattery
James Slattery1:08 PM

How come the Spartronics GitHub organization does not have any public members, so that the org shows on their personal profiles? I think it would be a cool way to show others that you are a part of the team.

Riyadth Al-Kazily
Riyadth Al-Kazily2:09 PM

Good point! It turns out each contributor gets to set themselves as "public" or "private", with the default being private. I just made myself public (because I do want the Spartronics logo on my Github profile page).

Riyadth Al-Kazily
Riyadth Al-Kazily2:10 PM

To make the change, go to the Spartronics Github page, click on "People" tab, and then click on your name. In the box on the left side of the page will be a selector to choose public vs. private.

James Slattery
James Slattery2:11 PM

Ah, alright. Could someone with access to merge these do so: https://github.com/Spartronics4915/developershandbook/pulls would be greatly appreciated :thumbsup::skin-tone-2:

Binnur Alkazily
Binnur Alkazily4:19 PM

I set a reminder for myself to do it tonight, unless someone gets to it faster ^^ @Jack Stratton @Declan Freeman-Gleason @Lia Johansen -- (with this said, I don't recall the permissions either :slightlysmilingface: )

Chris Rininger
Chris Rininger9:59 PM

I received some more information about the scouting app collaboration opportunity from the other Microsoft 4911 mentor, Anne. So I tacked the info on to what Johan already provided. Technology Summary

Chris Rininger
Chris Rininger10:06 PM

Tech summary for the 4911 scouting app collaboration opportunity:
1. Android programming (Java) for the app and Bluetooth client
2. C# .NET programming for the server
3. Tableau for the analytics

So far, I've heard interest from @Rose Bandrowski (not very interested in programming - maybe Tableau?), @Ronan Bennett (Java/Android maybe - any Android experience?), and @Declan Freeman-Gleason (maybe the end-to-end solution setup? or Java/Android). Rose, there was mention of graphic design, but the catch is there is a little bit of programmy work that goes with it. I recommend reading the updated PDF above.

Any mentors have .NET/C# knowledge that could help with that part?

Rose Bandrowski
Rose Bandrowski10:07 PM

@Rose Bandrowski has joined the channel

Rose Bandrowski
Rose Bandrowski10:09 PM

If I'm needed for graphics that is fine. I've taken AP Computer Science (Java), know HTML quite well, and a little C++ and JavaScript. I just hope that people more interested in programming than me take it on.

Chris Rininger
Chris Rininger10:12 PM

I have a few questions for Anne and Johan on the opportunity, like time commitment, when the collaboration will take place, how it will be done (e.g. video calls + Slack). If others have questions, please reply to this post, and then I can send them one bundle of questions from us.

Jack Stratton
Jack Stratton10:12 PM

I'm moderately familiar with android, much more with java in general. I'd be open to working with C# but I'd have to see if I can set up some kind of Mono dev environment

2017-06-22
Chris Rininger
Chris Rininger12:47 AM

Java-based vision solution from stronghold... could be worth checking out: https://www.chiefdelphi.com/forums/showthread.php?threadid=142173

Enrique Chee
Enrique Chee11:18 AM

Thanks Chris . Programmers check it out .

2017-07-17
Chris Rininger
Chris Rininger11:08 AM

I have some questions about how controls have been set up to work the past couple years... For Helios, with the Xbox controller, is it set up so the two joysticks on the controller are used, one controlling throttle of the left side & the other controlling throttle of the right? And if you want to turn while driving, you give one side more throttle than the other (i.e. like a differential/tank drive)? Or were throttle and steering separate, and the software figured out the differential? I'm similarly curious about how the flight stick controller works both with Helios and with Ares. Throttle and steering combined, supported by software I assume. Thanks

Riyadth Al-Kazily
Riyadth Al-Kazily2:30 PM

We have used "arcade drive" on our robots so far. That uses a single joystick to control both forward/reverse and steering (having the software take the x/y input from the joystick and figure out the relative speeds of the motors). This is a module provided by FIRST, and is fairly easy to implement. Our only modification to the standard usage (as far as I know) is to implement a "throttle" control that sets the maximum speed of the motors, allowing for more control of the robot in certain applications. Oh, and we also implemented a "reverse" mode, where the front of the robot becomes the back, from the driver/joystick point of view.

Riyadth Al-Kazily
Riyadth Al-Kazily2:33 PM

"Tank drive" is an alternative control scheme, also provided as a module by FIRST. That allows two joysticks to independently control the left and right motors. It is not to be confused with tank "treads" vs. wheels (unfortunately it is often confused in general conversation...). We never used that mode, due to driver preference (or due to the lack of trying it...)
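
The two control schemes described above reduce to a small mixing function. This is the generic arcade/tank math rather than Spartronics' actual drivetrain code:

```java
// Standard differential-drive mixing. Inputs are joystick axes in [-1, 1];
// outputs are {left, right} motor commands, clamped back into [-1, 1].
public class DriveMix {
    private static double clamp(double v) {
        return Math.max(-1.0, Math.min(1.0, v));
    }

    // Arcade: one stick supplies forward speed and turn rate.
    public static double[] arcade(double forward, double turn) {
        return new double[] { clamp(forward + turn), clamp(forward - turn) };
    }

    // Tank: each stick drives its own side directly.
    public static double[] tank(double left, double right) {
        return new double[] { clamp(left), clamp(right) };
    }

    // The "throttle" modification described above: cap the maximum output.
    public static double[] scaled(double[] sides, double maxOutput) {
        return new double[] { sides[0] * maxOutput, sides[1] * maxOutput };
    }
}
```

With this framing, the "reverse" mode is just negating `forward` before mixing.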

Chris Rininger
Chris Rininger4:16 PM

Thanks, Riyadth!

2017-07-19
Chris Rininger
Chris Rininger7:39 AM

Follow-up on the above... After doing some reading, it looks like the "single stick" arcade drive is commonly used with the flight stick style controllers. With Xbox & similar controllers, it seems, a more common approach may be to control left/right axis with the right side mini-joystick and forward/backward axis (i.e. throttle) with the left side mini-joystick. Some also seem to like adding on extenders (commonly used by FPS gamers) to the mini joysticks for added travel/precision. With tank drive, as one might assume, teams use two identical joysticks, ranging in size from controllers with two side-by-side mini joysticks (PS2-like?) up to two full-size flight sticks.

2017-07-29
Peter Hall
Peter Hall10:41 AM

@Peter Hall has joined the channel

2017-07-30
Chris Rininger
Chris Rininger10:11 AM

Hi, I was talking with Peter and Samantha during the BARN session yesterday about potential projects & raised the idea again about custom operator controls like these: https://1drv.ms/f/s!AikCDwtdoW5Lqj66386jgdCOtXj. One project could be to build a proof of concept for Helios's controls in the preseason... I looked at the operator flight stick, and the labels are below - would one of you please confirm what each operation is that is not marked understood? Thanks...
FLIGHT STICK HELIOS CONTROLS
Lower left: on and off (intake I assume?)
Lower center - left: reverse (not sure - also intake?)
Lower center - right: slow (intake?)
Lower right: climb and off (understood)
Stick - left: one shot (understood)
Stick - center: launch (understood - both hopper & shooter, right?)
Stick - right: unjam (understood - hopper)
Stick - lower center: stop (understood - hopper/shooter)
Trigger do anything?

2017-08-03
Declan Freeman-Gleason
Declan Freeman-Gleason12:12 PM

If you have the time, I highly recommend this video: https://youtu.be/8319J1BEHwM
It covers a lot of information on complex autonomous, mostly on motion planning, trajectory computing and following, but also a little on vision.

2017-08-05
Chris Rininger
Chris Rininger6:11 PM

For a while at robotics, Samantha and I discussed the custom operator controls idea further. Here's a concept that came out of that...

2017-09-08
Peter Hall
Peter Hall2:07 PM

Looks great

Peter Hall
Peter Hall2:08 PM

That would be a great preseason project or even something that we could work on during build season

2017-09-12
Chris Rininger
Chris Rininger7:27 PM

Related to controls, there are numerous interesting drive team capabilities teams have utilized discussed in this thread. Lots of possibilities... https://www.chiefdelphi.com/forums/showthread.php?t=158337&highlight=Fpv

Paul Vibrans
Paul Vibrans8:55 PM

For what it is worth, the US Coast Guard requires the gauges in their helicopters to be mounted on the instrument panel so that the pointers point straight up when the indicated parameter is at the correct operating value. At a glance, a pilot can tell if something is not right and to some extent how big the problem is.

2017-09-17
Chris Rininger
Chris Rininger11:30 AM

Looking around at what kinds of sensors teams use along with programming to aid navigation & driving. There are a couple intriguing navigation IMUs (inertial measuring unit, I think), and I'm wondering if we've ever taken a look at them. They are: A) NavX MXP and B) CTRE/Gadgeteer Pigeon. These units bundle multiple sensors together, and as far as I can tell, teams with swerve drives often use these to constantly track orientation and thus simplify steering for drivers. It seems like there should be other uses, even without using swerve. Both are on the AndyMark controls parts page: https://www.andymark.com/Controls-s/262.htm Thanks

Paul Vibrans
Paul Vibrans4:18 PM

AndyMark is also selling an Analog Devices Gyroboard, a single axis rate gyro. The description says one of these is in the 2017 Kit of Parts so we should have one somewhere.

2017-09-19
Terry Shields
Terry Shields11:58 AM

Ha! This is awesome. I'm working with a FIRST LEGO League rookie team and they are just discovering the gyro sensor that comes in the LEGO robot kit. They have already learned how valuable it is --- and how it has some drawbacks that programmers have to compensate for (namely, calibration and lag). As a side note on the LEGO gyro device, sometime over the last couple of years LEGO changed the gyro sensor without any announcement. It appears the newer gyros are now dual-axis, but LEGO has not fully utilized the dual-axis functionality yet. However, calibrating the old vs. the new requires different programming!

2017-09-22
Dana Batali
Dana Batali3:32 PM

@Chris Rininger: we have been using a 9 degree of freedom imu for the last couple years. This is how we know we're driving straight or turning a specific angle (as for autonomous)

2017-09-29
Randy Groves
Randy Groves8:33 PM

@Randy Groves has joined the channel

2017-10-03
Mike Rosen
Mike Rosen5:29 PM

Wait. What? Last year’s robot had an IMU to help it drive straight? Learn something new every day!

I don’t remember anyone working on that.

Declan Freeman-Gleason
Declan Freeman-Gleason5:55 PM

@Mike Rosen Nicklaus worked on it. There was actually a fair amount of drift that needed to be corrected for; it wasn't full-on PID, but it seemed to work. https://github.com/Spartronics4915/2017-STEAMworks/blob/238800b69e78f706c21e4f4900687bd4fe5e3eb3/src/org/usfirst/frc/team4915/steamworks/commands/DriveStraightCommand.java#L130
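
The linked command corrects heading drift using the IMU; stripped to its core idea, a proportional heading-hold looks roughly like this. The gain, names, and angle-wrapping here are illustrative, not the code at the link:

```java
// Sketch of IMU-based drive-straight correction: steer proportionally
// against heading error. kP is a made-up gain that would need tuning.
public class HeadingHold {
    private final double kP;
    private final double targetHeadingDegrees; // heading captured at command start

    public HeadingHold(double kP, double targetHeadingDegrees) {
        this.kP = kP;
        this.targetHeadingDegrees = targetHeadingDegrees;
    }

    // Turn command to feed into arcade drive alongside the forward speed.
    public double turnCorrection(double currentHeadingDegrees) {
        double error = targetHeadingDegrees - currentHeadingDegrees;
        // Wrap into [-180, 180) so the robot always turns the short way.
        error = ((error + 180.0) % 360.0 + 360.0) % 360.0 - 180.0;
        return kP * error;
    }
}
```

As Mike notes, for short straight runs a pure P term like this is often enough; the I and D terms matter more for sustained driving.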

Mike Rosen
Mike Rosen6:37 PM

Nice. I get it: keep going in the direction you were originally pointed. I guess that without sustained running the PID parameters aren't that critical.

2017-10-07
Kenneth Wiersema
Kenneth Wiersema6:23 PM

Here's the CAD file for Synthesis of Helios, if anyone's interested. I tried to set it up with the ports you guys assigned on the robot, but message me if there are problems with the file, or whether it works to begin with. I did run into some odd things just driving it with the software.

Kenneth Wiersema
Kenneth Wiersema6:24 PM

No bumpers and I don't think the intake will work, but that's as far as I know

Declan Freeman-Gleason
Declan Freeman-Gleason6:27 PM

@Kenneth Wiersema Thanks Kenneth! Accurate or not, I think having a working simulation can be beneficial for our testing process, especially for bugs that aren't accuracy-sensitive.

Kenneth Wiersema
Kenneth Wiersema6:28 PM

You're welcome

Chris Rininger
Chris Rininger8:07 PM

If you go to the Synthesis forum, the dev team will help you out if you encounter problems. Also, Sotabots has uploaded their past couple robots to Synthesis, and you could probably preemptively ask someone on their programming team if they have learned anything that makes the process easier. Good luck! I think it's amazing they are planning to have the 2018 game field available by the end of kickoff day.

2017-10-08
Declan Freeman-Gleason
Declan Freeman-Gleason1:31 PM

@Chris Rininger I know you've had a lot of interest in different control configurations, so I modified the 2017 codebase to allow on-the-fly detailed customization of controls so you and the drivers can find what works best. I used that as an opportunity to try out the Synthesis code emulator, and I've gotten it working enough to test my modifications and fix a few bugs in the new code. There are a number of things that the emulator doesn't support (yet?), most notably CAN Talon, that I had to convert into a bunch of dummy code to get it to run. It did work with our web dashboard, which in this situation was mostly what I needed for testing. Hopefully we can get a chance to test the new stuff out on a real robot so you can let me know what you think. The on-the-fly customization probably won't be useful once we get around to competition season, but I do think that it could help our new drive team in the preseason a bit. (Screenshot is of the dashboard connected to an emulated robot, running the new controller code.)

Chris Rininger
Chris Rininger1:59 PM

@Declan Freeman-Gleason love the configurability...
Standard Xbox controller config is left joystick = forward/backward and right joystick = left/right. To do just that, would the config just be...
Rotation (i.e. left/right): +/- sqrt(RJOYX)
Forward: +/- LJOYY
...and then the triggers could be used as dampeners for slower, more precise movement?

Thanks for doing this!

Declan Freeman-Gleason
Declan Freeman-Gleason2:04 PM

@Chris Rininger Yeah, that config is exactly right! If we wanted dampening we would just multiply by the trigger values. (E.g. LJOYY*LTRIG)

Chris Rininger
Chris Rininger2:11 PM

@Declan Freeman-Gleason so cool! This will help the drivers optimize controls much faster than if a code mod were required for each config change. Only other question is... are there varying value ranges to be aware of for joysticks vs. triggers (e.g. joysticks are 0-255 in each dimension but triggers are 0-99 or something like that)? I also heard the resolution of the Xbox Elite controllers is higher than the standard ones, but I suspect that may not be reflected in the actual numeric values.

Declan Freeman-Gleason
Declan Freeman-Gleason2:30 PM

@Chris Rininger I'm pretty sure that all the values for the triggers and joysticks are between -1 and 1, but if they aren't consistent it should be an easy fix in the code. Although the Xbox Elite controller is probably more accurate, I know it has the same precision as a regular Xbox controller.

Chris Rininger
Chris Rininger2:49 PM

@Declan Freeman-Gleason ok, thanks. So with absolute values < 1, the sqrt function results in a fairly aggressive response curve if I'm thinking about it right. And a slower-than-linear response curve could be achieved using the pow([input#],[power argument]) function, remembering to add the (-1) back to the function for going backward or left. Is only sqrt covered in the code, or can other functions like pow, log, exp be used?
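
Chris's intuition checks out numerically: for stick deflections between 0 and 1, sqrt boosts small inputs (aggressive response) while squaring suppresses them (gentler, finer low-speed control). A sign-preserving sketch of both:

```java
// Sign-preserving response curves for a joystick axis value in [-1, 1].
public class ResponseCurves {
    // Aggressive: sqrt amplifies small deflections (sqrt(0.25) = 0.5).
    public static double sqrtCurve(double x) {
        return Math.copySign(Math.sqrt(Math.abs(x)), x);
    }

    // Gentle: squaring suppresses small deflections (0.5^2 = 0.25).
    public static double squareCurve(double x) {
        return Math.copySign(x * x, x);
    }
}
```

`Math.copySign` handles Chris's point about "adding the (-1) back" for backward/left: the curve is applied to the magnitude and the original sign is restored.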

Declan Freeman-Gleason
Declan Freeman-Gleason6:49 PM

@Chris Rininger There's currently sin, cos, tan, sqrt, log, exp, and the ^ operator (instead of pow). It's really easy to add anything else if we need it.

Enrique Chee
Enrique Chee7:52 PM

Thanks Declan and Chris.

2017-10-09
Chris Rininger
Chris Rininger10:10 AM

One thing that came to mind on the ferry this morning is the opportunity to use a y=n+f(x) form for the controller response in order to use the full range of physical travel of the joystick or trigger, where n = the minimum number >0 for the robot to move. You could find n by simply plugging in constants, starting with a low number like .01 and increasing it until the robot moves when the control is pressed. And from there find an f(x) that will result in the desired response curve where y = 1 = (n + f(x=1)). You can play around with graphing tools like the one below to find possible functions to use. In this example, the curve from the squaring function was too gentle and the curve from the tan() function was too aggressive, so I averaged the two. Anyway, this is something to potentially try during controls tuning. Plus it is a fun example of applying these math functions to something real. Here's the example: http://www.mathsisfun.com/data/grapher-equation.html?func1=y%3D.2%20%2B%20((x-.115)%5E2%20%2B%20tan(.67x))%2F2&xmin=-1.450&xmax=1.450&ymin=-1.088&ymax=1.088
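
The y = n + f(x) idea can be written so the output spans exactly [n, 1] as the stick travels from just-off-center to full deflection. Here n = 0.2 and the squaring curve are illustrative stand-ins for values found by the plug-in-constants experiment described above:

```java
// Deadband-compensated response: the smallest nonzero command is n (the
// minimum output that actually moves the robot) and full deflection gives 1.
// n = 0.2 and the squaring curve are guesses for illustration.
public class DeadbandCurve {
    public static double shape(double x, double n) {
        if (x == 0.0) {
            return 0.0; // stick centered: no output at all
        }
        double curved = x * x;          // f(|x|); any response curve works here
        double magnitude = n + (1.0 - n) * curved; // spans [n, 1] for |x| in (0, 1]
        return Math.copySign(magnitude, x);
    }
}
```

Scaling f(x) by (1 - n) enforces Chris's constraint that y = 1 at x = 1 automatically, for any curve whose value at 1 is 1.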

Riyadth Al-Kazily
Riyadth Al-Kazily3:03 PM

@Chris Rininger I believe this is similar to how we have used a "throttle" control on our full-sized joysticks. There is a small potentiometer paddle that we used to set a scaling factor for the primary joystick x and y axes. That way, the driver could set the scaling factor dynamically.

Chris Rininger
Chris Rininger5:55 PM

@Riyadth Al-Kazily Yep, similar. Declan and I were pondering using the Xbox controller joysticks as the main controls (LJOYY=forward/backward, RJOYX=left/right) and then using the triggers as dampeners for finer control when needed. Something like... Forward speed = n + (f(LJOYY) * (1 - (LTRIG * 0.8))) maybe, where n = the number > 0 needed for the slowest possible speed, f(LJOYY) is the main response curve, and the (1 - (LTRIG * 0.8)) factor results in dampening from 0% to 80% depending on how much the trigger is pressed.
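
The dampening factor in Chris's formula, taken on its own, is just a linear scale-down of up to 80% driven by the trigger. As a sketch (the 0.8 factor follows his example):

```java
// Trigger-based dampening: trigger in [0, 1] scales a drive command down
// by up to 80%. The 0.8 maximum-reduction factor is illustrative.
public class TriggerDampen {
    public static double dampen(double command, double trigger) {
        double t = Math.max(0.0, Math.min(1.0, trigger)); // clamp trigger input
        return command * (1.0 - 0.8 * t);
    }
}
```

Trigger released leaves the command untouched; trigger fully pressed leaves 20% of it.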

2017-10-11
Vogl_Madeline
Vogl_Madeline9:34 PM

@Vogl_Madeline has joined the channel

Adam Rideout Redeker
Adam Rideout Redeker10:19 PM

@Adam Rideout Redeker has joined the channel

2017-10-12
Mike Rosen
Mike Rosen11:01 AM

Geek Humor: https://twitter.com/chronum/status/540437976103550976

A connection to make it more interesting. I saw this only b/c John Carmack retweeted it. Carmack is the guy who did id Software (Quake, Wolfenstein, Doom) and is now at Oculus VR.

Mike Rosen
Mike Rosen11:13 AM

About the humor: You need to click on the link to see what the picture is describing: "Multithreading in Theory and Practice." Really funny.

But actually, that was just what came to mind while I was thinking about Binnur's remark about "Herding Cats." This really is a big thing in the software industry. Years ago, I used to work at EDS (in Poulsbo, who knew?). Anyhow, this giant systems integrator decided they needed to spend tons of money on a Super Bowl commercial. This is the result: https://www.youtube.com/watch?v=mMaJDK3VNE

Binnur Alkazily
Binnur Alkazily11:19 AM

^^ oh, yea -- that pretty much sums up my day job!! just had to course correct w/ my dev team on top priorities. #daily_challenge

Justice James
Justice James4:27 PM

@Justice James has joined the channel

Darwin Clark
Darwin Clark4:53 PM

@Darwin Clark has joined the channel

Ryan Olney
Ryan Olney5:01 PM

@Ryan Olney has joined the channel

Ulysses Glanzrock
Ulysses Glanzrock6:50 PM

@Ulysses Glanzrock has joined the channel

2017-10-13
Austin Smith
Austin Smith8:46 AM

@Austin Smith has joined the channel

Willie Barcott
Willie Barcott1:45 PM

@Willie Barcott has joined the channel

2017-10-14
Josh Goguen
Josh Goguen1:39 PM

@Josh Goguen has joined the channel

Charlie Standridge
Charlie Standridge5:49 PM

@Charlie Standridge has joined the channel

2017-10-15
Declan Freeman-Gleason
Declan Freeman-Gleason2:26 PM

@Declan Freeman-Gleason pinned a message to this channel.

Darwin Clark
Darwin Clark4:28 PM

Hi all, this is a presentation that I created about what I did over the summer in regards to the vision platform. I would be more than happy to see a few comments or questions. Thanks!

Darwin Clark
Darwin Clark4:28 PM

Cory_Houser
Cory_Houser6:19 PM

@Cory_Houser has joined the channel

Binnur Alkazily
Binnur Alkazily6:32 PM

programming team from last year — any tricks for IMU reset?? Aside from not being 100% sure about the red vs. blue side robot positioning, it didn’t look like our robot was turning correctly in autonomous — @Declan Freeman-Gleason please sync w/ Rose to make sure we have that straightened out next week. thanks!

Declan Freeman-Gleason
Declan Freeman-Gleason6:48 PM

@Binnur Alkazily The BNO055 doesn't provide functionality to zero itself on the fly, so its position when you get an instance of it (it's a singleton) for the first time is the zero. We get an instance in the constructor of `Drivetrain`, which is constructed in `robotInit`. For @Rose Bandrowski, that means the robot has to be in its final position when you turn it on if you want autonomous turning to be accurate.
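One possible software workaround, sketched here with a hypothetical read function standing in for however the BNO055 heading is actually read: snapshot the heading once and report headings relative to that snapshot, instead of zeroing the hardware.

```python
# Sketch of software-zeroing an IMU heading. `read_raw_heading` is a stand-in
# for the real sensor accessor; it must return degrees in [0, 360).
class ZeroableHeading:
    def __init__(self, read_raw_heading):
        self._read = read_raw_heading
        self._offset = read_raw_heading()  # initial orientation becomes zero

    def zero(self):
        """Make the current physical orientation the new zero point."""
        self._offset = self._read()

    def heading(self):
        """Heading in degrees relative to the last zero, wrapped to [0, 360)."""
        return (self._read() - self._offset) % 360.0
```

Calling zero() right before autonomous starts would remove the requirement that the robot be in its final position at power-on.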

Darwin Clark
Darwin Clark6:59 PM

When you say 'reset', do you mean reset all the values (X, Y, Z) to 0? I remember having some functionality like that last year in FTC.

Cruz Strom
Cruz Strom7:00 PM

@Cruz Strom has joined the channel

Declan Freeman-Gleason
Declan Freeman-Gleason7:07 PM

@Darwin Clark When I said the BNO055 doesn't provide the functionality, I actually meant that the code we use to access it doesn't expose the functionality... That means we could add it, but we just didn't have time during the build season.

Binnur Alkazily
Binnur Alkazily7:20 PM

Hmm. - that is what I thought, but felt maybe we got red/blue swapped or something weird — basically did opposite of expected

Binnur Alkazily
Binnur Alkazily7:20 PM

We’ll make sure it works next Sunday - and ready for autonomous -

Chris Rininger
Chris Rininger8:46 PM

Wish list item for the drive team: lower latency video feed from the robot this year. I did some searching around, and the 7th post in this thread seems promising. Would someone please take a look to see if it is an approach we could use? Here's the link: https://www.chiefdelphi.com/forums/showthread.php?t=156781&highlight=camera+lag+ms And I'll copy in the post for convenience...

This year was the first we really found a need for a first person view from the robot for our driver in order to locate gears on the other side of the field. So we did some research into how to get the best quality back inside 2MB with decent latency.

The only way to get anything we deemed reasonable inside the 2MB was to use h.264 encoding. The Rio just isn't capable of doing this while running robot code and keeping any latency (lag) out of the system. The answer was in off-the-shelf security cameras. Most of them have 720p (20fps) output over RTSP with h.264 encoding. This worked well, but we dropped the resolution down to VGA (still at 20fps) to drop the latency down to just a few ms.

This is the specific camera we used: http://www.microcenter.com/product/4...amerawithPoE
It was a pretty simple job to take it apart and make a smaller enclosure so it fit better.

(thanks -Chrisrin)

Declan Freeman-Gleason
Declan Freeman-Gleason9:18 PM

I think that's definitely something we should look into... We could also try doing h.264 encoding on a Raspberry Pi or Jetson, which would work nicely with any vision system also running there.

Declan Freeman-Gleason
Declan Freeman-Gleason9:21 PM

Both the Jetson and the Raspberry Pi apparently have hardware-accelerated h.264, but getting that to work might be quite a can of worms.

Mike Rosen
Mike Rosen9:46 PM

@Darwin Clark , your exposure to OpenCV is very impressive. My quick look over this suggests you were able to use the Python interface to recognize rectangular features but stopped short of actually (a) identifying the targets of interest or (b) putting them in a reference frame that includes our robot (so we could steer toward it, shoot at it). Is that right? I'd love to hear more. Do you have thoughts on how to address these?

Declan Freeman-Gleason
Declan Freeman-Gleason9:51 PM

@Darwin Clark What version of OpenCV is your code written for? Also, nice work :slightly_smiling_face:

Darwin Clark
Darwin Clark9:53 PM

Identifying targets of interest will vary a lot based on the game. In the event that the game involves rectangular reflective tape, it should be a breeze (such as in Stronghold). My current working idea for driving to a target is to mark the field of view by physically testing it on the robot (watch the live camera view and move an object until it shows up in the camera view).

Darwin Clark
Darwin Clark9:53 PM

Something like this:

Darwin Clark
Darwin Clark9:53 PM

It's a wee bit tedious, and I'm open to other ideas
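A common alternative to physically marking the field of view is to convert a target's pixel position into an angular offset from the camera centerline. A sketch, with the resolution and horizontal FOV assumed rather than measured:

```python
import math

# Convert a target's horizontal pixel position into an angle off the camera
# centerline. IMAGE_WIDTH and HORIZONTAL_FOV_DEG are assumed example values.
IMAGE_WIDTH = 640
HORIZONTAL_FOV_DEG = 60.0

def pixel_to_angle(target_x):
    """Angle in degrees: negative = target left of center, positive = right."""
    # Offset from image center as a fraction of the half-width, in [-1, 1].
    center_offset = (target_x - IMAGE_WIDTH / 2.0) / (IMAGE_WIDTH / 2.0)
    # Tangent model keeps angles accurate away from the image center.
    half_fov = math.radians(HORIZONTAL_FOV_DEG / 2.0)
    return math.degrees(math.atan(center_offset * math.tan(half_fov)))
```

This still needs a one-time FOV measurement, but after that any pixel coordinate maps to an angle without further bench testing.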

Darwin Clark
Darwin Clark9:54 PM

Chris Rininger
Chris Rininger10:17 PM

Since folks are talking about vision, I thought I would re-share this little arm+gpu+cam solution that started on kickstarter and is now in full production. It's relatively cheap - I'm thinking of getting one to play with and possibly mount on the little track drive robot my son Lucas and I started building this summer. https://www.jevoisinc.com/

Darwin Clark
Darwin Clark10:38 PM

That seems pretty interesting regarding the neural net. I really wanted to integrate that into my vision platform, but never ended up figuring out how. Do you know if the example code is online somewhere @Chris Rininger ?

Chris Rininger
Chris Rininger10:48 PM

maybe here? http://jevois.org/basedoc/groupdarknetprof.html I have to be honest; I am interested in the capabilities, but the programming itself is (at this point) beyond me.

Dana Batali
Dana Batali10:49 PM

@Darwin Clark: nice presentation, and great that you've developed some real experience here. The python+opencv+jetson is exactly the approach we followed two years ago, so it might be useful to peruse the github from that year. As we briefly discussed, there are significant challenges keeping co-processors running during a match (separate power requirements, extra networking, physical stability of the mount point, more tool chains to keep up-to-date, more software deployment issues, etc). If our vision solution really requires extra compute power and we have time to properly design its location on the robot, then this is still a good way to go. If you look at the repo from a couple years back (stronghold, i think) you'll see that we used the python binding of network tables to communicate our vision results back to the roborio. We also had a streaming web server running on the jetson that could be viewed from the driver station.

Other approaches that are available:

- there is now an opencv option (with java bindings) running on the roborio... The only disadvantage of this approach is that it consumes resources from the roborio, and one would need to perform careful analysis of the final vision algorithms running on the cpu alongside the other software in game configuration to ensure that we're within the capabilities of the roborio. Chief Delphi has some proponents of this approach; there are also opponents.

- last year we purchased a more canned solution in the form of the CMU cam: this is something similar to what @Chris Rininger refers to above: namely, it has both camera and processor bundled into a single package. The advantage of the CMU cam (aka pixy) is that its simple algorithm is guaranteed to run at ~50hz and requires no real "vision algo dev" on our part. The downside is that it only recognizes blobs with a strong hue component. In my experience, this may not actually be such a bad limitation. That said, we didn't explore this option in depth last year because vision was determined by captains to be low on the priority list - and less important for our gameplay strategies (where we'd find known-good shooting positions manually).

- i recall perusing the other option mentioned above (jevois) and since it wasn't available and didn't seem to offer significant advantages over our jetson solution, it wasn't pursued further.
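The NetworkTables pattern described above (vision results published from the co-processor, read by the roborio) looks roughly like this with the pynetworktables binding. The table and key names are invented for illustration; the server address is the roborio's static IP from later in this channel. A stand-in table keeps the sketch runnable on a machine without the library installed:

```python
# Sketch of publishing vision results from a co-processor via pynetworktables.
# Table/key names are illustrative, not the team's actual schema.
try:
    from networktables import NetworkTables
    NetworkTables.initialize(server="10.49.15.2")  # roborio static address
    table = NetworkTables.getTable("vision")
except ImportError:
    # Tiny in-memory stand-in so the sketch runs on a dev laptop.
    class _StandInTable(dict):
        def putNumber(self, key, value):
            self[key] = value
        def putBoolean(self, key, value):
            self[key] = value
    table = _StandInTable()

def publish_target(angle_deg, distance_in):
    """Push the latest vision result; robot code polls these keys."""
    table.putNumber("targetAngle", angle_deg)
    table.putNumber("targetDistance", distance_in)
    table.putBoolean("targetAcquired", True)
```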

Dana Batali
Dana Batali10:51 PM

Regarding video latency:
- there are two issues here: frame rate and latency. I believe that the latency issue is really the challenging problem, more than the encoding. That is: we did get reasonable frame rates, it's just that they were from 1/2 a second prior, which is really tough for drivers.

Regarding ip cameras:
- we did follow this approach last year: we bought a cheap IP camera on which we tweaked both resolution and framerate. I don't recall if it supported h.264, but we found that the "lab experiments" we conducted were always more optimistic than the real field experience, where they enforce bandwidth limiting, etc.
- the camera referenced above refers to RTSP, which I'm not familiar with, so this might be worth delving into more about. (https://developer.mozilla.org/en-US/Apps/Fundamentals/Audioandvideodelivery/Livestreamingwebaudioandvideo)

Other tidbits:
- last year we also experimented with a usb cam and a server on the roborio. We found that the roborio could only serve a single camera, and we ended up with two cameras: an IP camera for the forward direction and the usb camera for the back view (or vice-versa).

- setting up the video server on the roborio wasn't trivial, but I believe @Riyadth Al-Kazily crossed all the t's on that and it did end up working reasonably well (except for the latency problem).

- for streaming jpeg encoding, we found that the chrome browser performed terribly. To this end, we adopted Firefox for the driver station and it was way better.

- some cameras offer wide fields of view. We have one of these, but it wasn't as plug-and-play, since the resolutions it supported weren't standard. We purchased a weird macro lens (for iphones, for $10) and glued it onto the ip camera. This worked surprisingly well. It does underscore the point that what a driver might need from a camera might be at odds with what a vision system might need. (it's harder to invert spherical projections than perspective projections, and this might make vision a little more challenging).

Chris Rininger
Chris Rininger10:52 PM

I thought I remembered you saying we used the IP camera - I shared that thread because the approach to reducing the latency seemed worth investigating - the 640x480 @ 20fps and just a few ms lag sounded good for the drivers - thanks

2017-10-16
Darwin Clark
Darwin Clark8:42 AM

@Dana Batali In regard to computation power, when I was testing I watched the CPU as well as RAM usage of the Jetson. The CPU seemed to flatline around 70%, with only one or two gigs of RAM in use. In short, I'm skeptical of moving to a different platform for lack of computing power, but I don't think it's entirely out of the question.

Dana Batali
Dana Batali9:38 AM

@Darwin Clark: i agree that jetson likely offers us the most compute power and that non-trivial vision probably can't be done on the roborio for lack of resources. There is a middle road used by several teams: beaglebone or raspberry pi. These don't have quite the power of the jetson, but consume fewer watts. Not to be taken lightly: the acquisition rate of any vision solution. The usb (2) bus can't deliver 30 fps at DVD quality. Next-gen vision solutions often offer a fast-path from camera to the compute engine. The jetson TX1 has this (and we have one of these, we could explore). In stronghold, we found that we could only get around 10fps from our jetson plus usbcam (even at lower resolution) and this has implications for how to integrate vision target acquisition into the control loop. The CMU camera has the nice property of delivering targets at 50 fps. At this higher frame rate, it's possible to integrate vision targeting in the inner control loop of the robot. But at 10 fps, we would need to have a slug-speed robot to ensure that the closed-loop feedback doesn't cause significant oscillations. We did some bizarre (unintentional) robot dances during the stronghold development. To achieve fast targeting with a slow vision target acquisition rate, the standard approach is to identify the target first (usually as an angular offset to the current heading), then use the imu and pid controller to get to the target quickly.

Anyway it's great that you have both the interest and the skill set to look into these questions. I'll be happy to help!
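The "identify the target first, then use the imu and pid controller" approach sketched above can be shown in a few lines. One slow vision measurement fixes an absolute heading setpoint; a fast IMU-based loop then servos to it. The gain is illustrative and only the proportional term is shown:

```python
# "Snapshot then servo": a 10 fps vision result sets the heading setpoint
# once; the 50+ Hz control loop afterwards uses only the IMU.
KP = 0.02  # proportional gain on heading error, illustrative only

def setpoint_from_vision(imu_heading_at_capture, vision_offset_deg):
    """Absolute heading to turn to, from one vision angular offset."""
    return (imu_heading_at_capture + vision_offset_deg) % 360.0

def turn_command(current_heading, target_heading):
    """Turn power in [-1, 1] from a simple P controller on heading error."""
    # Wrap the error into [-180, 180) so we always turn the short way.
    error = (target_heading - current_heading + 180.0) % 360.0 - 180.0
    return max(-1.0, min(1.0, KP * error))
```

Because the inner loop never waits on the camera, the slow vision acquisition rate no longer limits how fast the robot can turn, which is the point Dana makes about avoiding closed-loop oscillations.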

Dana Batali
Dana Batali9:46 AM

@Chris Rininger: thanks for the pointer to the camera thread. Reading beyond the initial post you reference I learned that there are a few caveats with this particular approach:

- they used a custom app to display the image on the driver station. ie: they didn't (couldn't?) integrate it with a typical dashboard. This approach may be viable if and only if the different windows can be laid out so as to not interfere with one another. Clearly it would be preferable to view the video in our dashboard context (and this may be possible since we have a browser-based dashboard).

- also: h.264 requires non-trivial computation both to encode and, more importantly, to decode. This means that this format may not be easy to integrate into an efficient vision solution. In the referenced thread, there is some discussion of all this by the ILAMtitan poster.

Chris Rininger
Chris Rininger10:43 AM

@Dana Batali thanks for giving it a look. One question comes to mind: would it be appropriate to use the IP camera & h.264 for just the driver video feed and a different camera/solution for vision? As far as displaying the video, I'm going to recommend a second monitor on the refreshed drive station, so the laptop monitor for the dashboard & a 2nd one for the robot video stream (or vice versa). I think with the driver using the xbox controller & moving around to get sight lines, a larger dedicated image (ideally with low latency) is more likely to be used. It'll take some doing, but I believe it is possible.

Dana Batali
Dana Batali10:51 AM

a definite possibility. Pretty much the same as we did last year (two cameras running through the roborio subnet), i wonder whether the two monitor solution might be overkill (and make for a very bulky driver station to carry around). As an aside, have you identified a student to help carry the torch you are holding? I have some small concerns that we're treading into the mentor-led, rather-than student-led territory here.

Dana Batali
Dana Batali11:00 AM

one other point: it's great that you are focused on the driver experience; the key risk associated with camera feeds is how well they will perform on the field. To ensure that all the lab dev work isn't wasted, it may be wise to look into simulating the network conditions associated with a real match: multiple robots contending for bandwidth via a shared router with QoS scheduling going on.

Darwin Clark
Darwin Clark11:43 AM

@Dana Batali Because I was spending time on testing and delivering, I hadn't thought about FPS or how that may affect output w/ the Rio. It seems that getting 10 values per second for the targets would be fine (the camera was operating at 10 FPS when I was testing); it would depend on how fast NetworkTables works (I have no idea how it works) for that to actually be a limitation. What FPS was the camera running at two years ago? @Dana Batali

Dana Batali
Dana Batali11:47 AM

@Darwin Clark: as i mentioned above, we got 10ish FPS through the python/opencv/jetson system. The usb camera on the jetson was able to deliver nearly 30 fps (at vga-res) with no processing iirc. 10fps for processed results can definitely be made to work, just not in the inside of the control loop... I can perhaps better motivate this point in person.

Chris Rininger
Chris Rininger1:00 PM

@Dana Batali Yes, Jon C was just recently confirmed as drive coach, and we'll be transitioning things to him, though I'll still be in an advisory mentor role to help make sure there is awareness of options/opportunities. I've talked with Peter already about the prospect of designing a new drive station, including custom controls - haven't talked about the 2nd monitor thing - agree we may not need it & there are trade-offs.

Mike Rosen
Mike Rosen1:35 PM

Hey gang, Declan sent out an email on the 14th asking everyone to get a development environment set up. Consider this a gentle reminder. If last year was any guide, it is distinctly non-trivial ... and many of us ended up doing it several times -- so beyond just following the instructions, try to acquire at least a nodding familiarity with what all we're installing and why.

Those of you running Linux might get lucky and have it 'just work.' If you're not so lucky, I found (Debian Stretch) that installing "libwebkitgtk-1.0.0" fixes an unsatisfied dependency and allows the plugin wizards (File/New Project/Robot Example Project) to run without crashing.

Declan Freeman-Gleason
Declan Freeman-Gleason1:54 PM

Please see @Mike Rosen's previous message.

Darwin Clark
Darwin Clark1:54 PM

@Declan Freeman-Gleason Just based on release dates it looks like I would be using 3.2.0.

Declan Freeman-Gleason
Declan Freeman-Gleason1:58 PM

@Darwin Clark Ok... Are there any Jetson specific assumptions in the code?

Darwin Clark
Darwin Clark2:09 PM

@Declan Freeman-Gleason Not that I remember.

2017-10-17
Dana Batali
Dana Batali11:17 AM

here is the primary script we used to perform vision - it allowed us to experiment with a number of different approaches during dev. Once we settled on one of the algorithms, we created a bootstrapping mechanism that would automatically run this script in the preferred mode.

https://github.com/Spartronics4915/2016-Stronghold/blob/master/src/org/usfirst/frc/team4915/stronghold/vision/jetson/imgExplore2/imgExplore.py

Mark Tarlton
Mark Tarlton7:08 PM

@Mark Tarlton has joined the channel

Darwin Clark
Darwin Clark9:44 PM

Fantastico, I'll probably end up looking at this around Thursday if I get my IDE set up properly. Still jumping through hoops with Ubuntu.

2017-10-22
Binnur Alkazily
Binnur Alkazily3:23 PM

@Declan Freeman-Gleason all ok on autonomous. However, any tricks for reverse camera? It is not showing anything.

Declan Freeman-Gleason
Declan Freeman-Gleason3:25 PM

That's the network camera... Try going to 10.49.15.13/video.cgi

Declan Freeman-Gleason
Declan Freeman-Gleason3:26 PM

The username should be `admin` and the password should be blank.

Declan Freeman-Gleason
Declan Freeman-Gleason3:27 PM

If that doesn't work then it's an issue with the actual camera, if it does, then it's probably an issue with the dashboard (I would guess that the login process is being weird?).

Riyadth Al-Kazily
Riyadth Al-Kazily3:28 PM

I cannot access the camera URL you provide (but can access the roborio at 10.49.15.2). The IP address of the camera does not respond to pings. (But the green light on the camera is illuminated)

Riyadth Al-Kazily
Riyadth Al-Kazily3:29 PM

I will check the Ethernet cable next...

Riyadth Al-Kazily
Riyadth Al-Kazily3:34 PM

Checked the Ethernet cable. It is secure at both ends, but still no communication with the camera.

Declan Freeman-Gleason
Declan Freeman-Gleason3:36 PM

Hmm...

Riyadth Al-Kazily
Riyadth Al-Kazily3:37 PM

I'm thinking of trying a new Ethernet cable. The camera is using one of those thin, flat ones... I don't know if I trust those...

Declan Freeman-Gleason
Declan Freeman-Gleason3:37 PM

You could try plugging it into your computer.

Riyadth Al-Kazily
Riyadth Al-Kazily3:38 PM

True. I will do that when they stop driving.

Riyadth Al-Kazily
Riyadth Al-Kazily3:38 PM

The IP address is static on the camera, right? No DHCP?

Declan Freeman-Gleason
Declan Freeman-Gleason3:41 PM

I would think that it would be set to static, otherwise it would have an issue with the radio on the robot (that has DHCP disabled).

Riyadth Al-Kazily
Riyadth Al-Kazily4:09 PM

We unplugged the camera at the radio and I plugged in to my computer. It worked fine. We plugged it back in to the robot radio, and it worked fine on the driver station. So it was probably a loose connection on the Ethernet at the radio. It's all good now. Thanks @Declan Freeman-Gleason for your support.

Declan Freeman-Gleason
Declan Freeman-Gleason4:18 PM

Sounds good

Enrique Chee
Enrique Chee4:59 PM

Thanks guys !

Binnur Alkazily
Binnur Alkazily5:22 PM

@Declan Freeman-Gleason thank you for remote support! :)

Rose Bandrowski
Rose Bandrowski8:34 PM

@Binnur Alkazily since we fixed the camera, do I need to ask Declan to come in on Tuesday after school still?

Binnur Alkazily
Binnur Alkazily8:35 PM

probably not — though, we didn’t really fix it… we just replugged it a couple times :confused:

Rose Bandrowski
Rose Bandrowski8:36 PM

Sorry, terminology :sweat_smile:

Binnur Alkazily
Binnur Alkazily8:37 PM

no worries — it is the default ctrl-alt-delete action of windows :slightly_smiling_face:

2017-10-23
Declan Freeman-Gleason
Declan Freeman-Gleason2:25 PM

@Emma Lahtinen @Cory_Houser @Charlie Standridge @Josh Goguen @Willie Barcott @Austin Smith @Ulysses Glanzrock @Ryan Olney @Darwin Clark @Justice James @Adam Rideout Redeker @Vogl_Madeline Just a friendly reminder to try setting up your computer using the instructions in the email I sent to you. If you can't get something to work, we'll be there to help on Wednesday, but we would really appreciate if you try all the steps on your own. Also remember to set your Slack profile picture to a good photo of yourself!

Emma Lahtinen
Emma Lahtinen2:25 PM

@Emma Lahtinen has joined the channel

Ryan Olney
Ryan Olney2:26 PM

I don't have a portable computer that I can bring back and forth, so what should I do about that?

Declan Freeman-Gleason
Declan Freeman-Gleason2:28 PM

@Ryan Olney It sounds like you should borrow a team computer, which means that you don't need to do anything right now but set up Slack (which you've already done).

Ryan Olney
Ryan Olney2:29 PM

Ok thx

Enrique Chee
Enrique Chee6:13 PM

Ryan, still set up at home so you can program at home.

Enrique Chee
Enrique Chee6:15 PM

You can borrow a laptop during the meeting but not take it home.

Declan Freeman-Gleason
Declan Freeman-Gleason6:30 PM

@Enrique Chee @Ryan Olney That's a good point, thank you Mr. Chee.

Ryan Olney
Ryan Olney7:19 PM

Ok will do

Ulysses Glanzrock
Ulysses Glanzrock8:54 PM

My laptop has been acting up and going through various updates and is just not working right now, so can I use a team computer for this next meeting?

Ronan Bennett
Ronan Bennett9:05 PM

@Ulysses Glanzrock Yes, you can use a team computer during the meeting, but you should also bring your own laptop in case we are able to help with the setup

2017-10-25
Josh Goguen
Josh Goguen3:27 PM

The meeting is at 6:15 today, right?

Josh Goguen
Josh Goguen3:28 PM

Wait I got it, never mind

Cory_Houser
Cory_Houser4:29 PM

My Mac is having some problems downloading the Eclipse software. Does anyone have any tips, or should I just check it at the meeting today?

Ronan Bennett
Ronan Bennett4:46 PM

@Cory_Houser Unless someone else has Mac tips, we'll just sort it out at the meeting

Johan coondog
Johan coondog6:57 PM

@Johan coondog has joined the channel

Riyadth Al-Kazily
Riyadth Al-Kazily7:40 PM

Notes on how to shut down Mr. Chee's Samsung laptop if it won't shut off: https://superuser.com/questions/290132/how-to-force-power-off-of-a-samsung-series-9-laptop

Riyadth Al-Kazily
Riyadth Al-Kazily7:40 PM

Short story, stick a paperclip in the hole in the middle of the back.

Riyadth Al-Kazily
Riyadth Al-Kazily7:42 PM

Then you have to plug it in before it will boot again.

Terry Shields
Terry Shields7:44 PM

What? No voodoo spells required?

Anna Banyas
Anna Banyas10:34 PM

@Anna Banyas has joined the channel

2017-10-26
Martin Vroom
Martin Vroom10:45 AM

@Martin Vroom has joined the channel

2017-10-28
Binnur Alkazily
Binnur Alkazily7:48 AM

@Declan Freeman-Gleason do you have drivers for the usb Ethernet?

Dana Batali
Dana Batali11:28 AM

very interesting news for FRC 2018 wrt programming:

1. new game data api (opens possibilities for better automated scouting)

2. python is now acknowledged as an FRC language

3. a new BLDC (brushless) motor controller and supporting s/w

details here: https://www.firstinspires.org/robotics/frc/blog/2018-beta-teams-brushless-game-specific-data

2017-10-29
Mike Rosen
Mike Rosen4:41 PM

I want to share a conversation I had with team 4450 who implemented computer vision: in autonomous mode, their robot looks for the two shiny tape blocks on either side of the peg (for the gear) and uses that to drive the robot so the gear lands on the peg.

It works like this. They install Grip, the WPILib tool that FRC provides for simple CV integration, on a laptop -- totally unconnected to any robot anything. They point the laptop's webcam at the targets (shiny tape) and fuss with the dials until it recognizes the shapes. Then Grip dumps out a Java class -- a .java file -- which derives from wpilib.vision.pipeline. They put this class into their robot code. It has two APIs we care about: process() and getTheArrayOfTargetShapes(). They plug a web cam into the Roborio's USB port. While the robot is driving they have a loop that looks like this:


while (true) {
    Mat img = webcam.readCurrentImage();
    pipeline.process(img);
    ArrayList<Rect> targets = pipeline.getTheArrayOfTargetShapes();
    // make sure there are two targets, and find the center of each
    // the midpoint between the two centers is where the peg is
    // calculate the offset between the peg and the center of the image
    // feed that offset into the drivetrain so we steer toward it
}



I obviously am playing fast and loose with the details but he walked me through firing up Grip and using the generated code in the Robot. I was impressed with how straightforward the whole thing seemed. He said it worked pretty well.

Riyadth Al-Kazily
Riyadth Al-Kazily6:52 PM

FYI, that's Olympia Robotics Federation (http://orf4450.org/), and the code for their robot is on GitHub here: https://github.com/ORF-4450/Robot10

Declan Freeman-Gleason
Declan Freeman-Gleason8:00 PM

After playing around with Grip and Java/Python from scratch, Grip feels like a hassle to me. You have to know what processing steps you want whether or not you're using Grip, and there isn't much actual complexity in the vision code that you would write by hand. Any of the complexity is in the logic that comes after the processing, but Grip didn't really do a lot in that department. To me it seems like the challenge is just tuning and robot integration, which you really need a robot with a correctly positioned camera and consistent lighting for anyway. I do think that running the vision code on the RoboRio, written in Java, is very appealing. I'm not a big fan of weakly typed or interpreted languages, especially when testing comes at a premium. An actual objective advantage of this is the elimination of a coprocessor/separated codebase. @Mike Rosen Did you ask them about how the RoboRio handled the workload?

Mike Rosen
Mike Rosen8:08 PM

I specifically asked about the impact of the image processing on the RoboRio: "we have concerns that its very processor-intensive and the processor may already be pretty busy"... "might be... but we didn't notice anything."

Declan Freeman-Gleason
Declan Freeman-Gleason8:10 PM

That's interesting... They appear to have integrated it into a control loop, but I never really noticed how well their vision worked though. Control code here: https://github.com/ORF-4450/Robot10/blob/master/src/Team4450/Robot10/Autonomous.java

Riyadth Al-Kazily
Riyadth Al-Kazily10:06 PM

Do we have any scouting data we can refer to, regarding their autonomous gear delivery?

Declan Freeman-Gleason
Declan Freeman-Gleason10:40 PM

I can't seem to find any by searching Slack... Am I missing something, or is the data actually missing?

2017-10-30
Riyadth Al-Kazily
Riyadth Al-Kazily11:12 AM

You may need to follow up with Kenneth. Maybe it is not exported to somewhere searchable yet.

Randy Groves
Randy Groves11:20 AM

boot

Randy Groves
Randy Groves11:21 AM

AAAH! Too many keyboards!

Darwin Clark
Darwin Clark11:41 AM

@Declan Freeman-Gleason Mike and I talked about this as well. My viewpoint on GRIP in general is that it worked very well from a planning point of view. I used it when I still didn't know the tools available in OpenCV. Everything was neatly organised for me. It was a good graphic organiser, but the capabilities for deploying code with it are unknown to me.

Dana Batali
Dana Batali12:11 PM

two questions come to mind:

1. did it actually work?
2. did they say how many vision targets per second they achieved? If that number isn't high, then they would have to move very slowly to prevent oscillations.

next point: a peg-offset coupled with a distance is generally insufficient to deliver successfully. It presumes that the approach angle is approximately perpendicular to the peg. (which can be "guaranteed" either during autonomous or by the driver)

finally: if i understand it, GRIP is just a way to avoid coding and use a simpler (graphical) interface to describe an imaging pipeline, right?
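The point above about the perpendicular approach can be illustrated with quick trig: even when the camera offset reads zero, a robot whose heading is skewed from the peg axis at contact arrives with the gear's opening displaced sideways. A rough sketch, where the gear width is an assumed figure rather than a measured one:

```python
import math

# Rough illustration of why a perpendicular approach matters. If the robot's
# heading is skewed by `skew_deg` from the peg axis when it arrives, the
# gear's opening is displaced sideways by roughly half the gear width times
# sin(skew). GEAR_WIDTH_IN is an assumed figure for illustration.
GEAR_WIDTH_IN = 11.0

def opening_displacement(skew_deg):
    """Approximate sideways displacement of the gear opening, in inches."""
    return (GEAR_WIDTH_IN / 2.0) * math.sin(math.radians(skew_deg))
```

Even a modest skew produces a displacement comparable to the peg's tolerance, which is why the offset-plus-distance measurement alone is insufficient.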

Declan Freeman-Gleason
Declan Freeman-Gleason1:53 PM

@Kenneth Wiersema Do you have the scouting data from Girls Gen?

Kenneth Wiersema
Kenneth Wiersema4:10 PM

Here’s my condensed data: 4450 had an auto gear average of 0.8 for the first 30 matches, so successful 80% of the time https://docs.google.com/spreadsheets/d/1K3-izMjgS3O6rewK3mSjOlz2b35jTIvL5bNsliZoek

Kenneth Wiersema
Kenneth Wiersema4:10 PM

Declan Freeman-Gleason
Declan Freeman-Gleason5:30 PM

Thanks Kenneth.

Declan Freeman-Gleason
Declan Freeman-Gleason5:30 PM

Those numbers aren't too bad.

Declan Freeman-Gleason
Declan Freeman-Gleason5:34 PM

@Dana Batali What experience do we have using a Raspberry Pi? Do you think that there's any reason to even consider the Jetson?

Chris Rininger
Chris Rininger11:17 PM

CD thread on that little camera I mentioned a while back - check 7th post... https://www.chiefdelphi.com/forums/showthread.php?threadid=159883

2017-10-31
Dana Batali
Dana Batali9:18 AM

i would guesstimate that a jetson tk1 still offers several times the raw performance, so a team that knows how to leverage that power would have an advantage. That said, I attended some talks at worlds and discussed the topic of neural-net-based vision with a very-on-the-ball student and he said: it may not be worth the trouble, for the simple problems usually presented in FRC. So I would venture to guess that the pi3 is the better solution because:

1. it runs on 5V - and thus won't require extra buck-boost equipment. We should be able to power it directly off the PDP. (requires max of 2A, so we'd need to verify that).

2. there is a high-speed camera bus with widely available camera solutions that will allow us to bypass USB issues and increase the framerate. Raspberry pi has a bigger dev community than jetson (albeit more enthusiasts and fewer experts).

3. as for experience: I have some, i would guess that other mentors do. I have a raspberry pi 2 that I'd be happy to donate, but since the raspberry pi 3 is only $40-$45, I would guess it's better just to fork over the dough for those. I would be happy to donate a couple of those to the team, if there is consensus that we should go that way.

Perhaps the next step is for student leaders to make a determination on this question... An important consideration: are there sufficient programming resources to dedicate to this task. (i believe the answer is that we'd only need @Darwin Clark with oversight from @Declan Freeman-Gleason, so resources may not be an issue).

Terry Shields
Terry Shields9:20 AM

Also see post #8 (just posted this morning) in the link that Chris provided above. Provides even more insight into the camera and preliminary programming. It's worth following asid61 as he continues to evaluate.

Darwin Clark
Darwin Clark9:38 AM

@Dana Batali Resources (specifically manpower) aren't an issue. I think I'm basically going to be focusing on vision the entire year (so long as the game allows it). The process right now is discussing the platform, which leads us back to your statement: student leadership needs to make a decision.

Dana Batali
Dana Batali1:34 PM

re chief delphi thread: https://www.chiefdelphi.com/forums/showthread.php?threadid=159883

this unit seems comparable to raspberry pi 3 - it may actually be a bit cheaper ($50 includes camera)...

Hardware specs for the jevois:
https://www.jevoisinc.com/pages/hardware

https://cdn.shopify.com/s/files/1/1719/3183/files/comparison-to-rpi1024x1024.png?v=1488568740


Seems like it has less RAM but a faster clock speed.

From an opencv point of view, the unit is quite comparable to a pi3 it just appears to have lots more canned solutions. This will advantage teams who want plug-n-play vision. My predilection is that students learn the nuts-and-bolts of things, rather than learn the skill of good shopping for off-the-shelf solutions. For this reason, I think I'd lean toward the pi3, but there's no doubt that the jevois seems comparable and viable.

Other differences:

pi3 has wired, wifi, and bluetooth networking, but jevois has only usb. It's not yet clear to me how jevois would signal the "host". It appears to have a custom serial terminal interface, so we'd need to write a custom serial client app to poll for results and communicate these to the robot. Or perhaps we'd simply integrate the "terminal" into the robot code. Contrast this with network table-based communication using pynetworktables.
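To illustrate what a "custom serial client" might involve, here is a hedged sketch of parsing one target message off the serial stream. The `TGT <x_offset> <distance>` message format is invented for illustration; the real JeVois serial protocol would need to be confirmed against its documentation.

```python
def parse_target_line(line):
    """Parse one hypothetical target message from a serial stream.
    The 'TGT <x_offset> <distance>' format is invented for illustration --
    the actual JeVois protocol must be checked against its docs.
    Returns (x_offset, distance), or None for unrecognized/garbled lines."""
    parts = line.strip().split()
    if len(parts) != 3 or parts[0] != "TGT":
        return None
    try:
        return float(parts[1]), float(parts[2])
    except ValueError:
        # a partially-transmitted or corrupted line: ignore it
        return None
```

A robot-side client would poll the serial port, run each line through something like this, and forward valid results to the robot code (or to NetworkTables).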

Dana Batali
Dana Batali1:37 PM

we have a vision thread going on under the raspberry pi3 thread heading. There I comment a little on my reading of jevois. 2cent summary: it could work, but it may not be a better choice than raspberrypi 3.

Dana Batali
Dana Batali1:40 PM

one other consideration for choice of vision platform - if we want nvidia sponsorship, @Jon Coonan indicates that we need to make a stronger commitment to delivering solutions atop their platform.

So to summarize we have 5+ vision platform options:

1. roborio
2. jetson tk1 or tx1
3. raspberry pi 3
4. cmucam (pixycam)
5. jevois

Chris Rininger
Chris Rininger2:54 PM

One approach that could help with deciding is the use of a weighted attribute analysis like on slide 9 of this presentation... http://files.andymark.com/Designing-a-FRC-Robot-a-Team-Approach.pdf Identify factors, weight them by importance, and then score each option. Just the exercise of writing down the factors can help a lot. I think ability to iterate quickly when tuning the solution would be up there on the list. Mountability also fairly important depending on the game.

2017-11-04
Declan Freeman-Gleason
Declan Freeman-Gleason10:18 AM

@Emma Lahtinen @Cory_Houser @Charlie Standridge @Josh Goguen @Willie Barcott @Austin Smith @Ulysses Glanzrock @Ryan Olney @Darwin Clark @Justice James @Adam Rideout Redeker We need everyone to try and fully setup their computer by next meeting (Wednesday, November 8). This isn't a straightforward task, so we want you to ask questions on Slack when you need help. Here is a link to the setup steps: https://docs.google.com/document/d/11R18MPRdWPKES-BYCrJC7TPFS6ihdAW2kNvuGPusedE/edit?usp=sharing

2017-11-05
Dana Batali
Dana Batali11:27 AM

@Declan Freeman-Gleason, @Darwin Clark here is some "required reading" for video stream processing on raspberry pi:

https://picamera.readthedocs.io/en/release-1.13/fov.html

2017-11-08
Darwin Clark
Darwin Clark8:15 PM

For those dealing with the environment variable issue (JAVA_HOME set improperly), this was my fix:

Darwin Clark
Darwin Clark8:18 PM

I navigated into C:\Program Files\Java and found two folders, JDK and JRE. I then went into the system environment variables UI and added a new system variable titled 'JAVA_HOME' pointing at C:\Program Files\Java\jdk1.8.0_151, restarted Eclipse a few times, and then it worked fine.

Darwin Clark
Darwin Clark8:18 PM

If anything is unclear, go ahead and ask me.

Declan Freeman-Gleason
Declan Freeman-Gleason8:29 PM

This could be a Java 9 issue, because I did exactly the same thing with Java 9 installs.

Darwin Clark
Darwin Clark8:36 PM

And I assume it did not work?

Declan Freeman-Gleason
Declan Freeman-Gleason8:36 PM

You assume correct

Binnur Alkazily
Binnur Alkazily10:59 PM

:slightly_smiling_face: are the instructions leading folks to install java 9? If so, do they need to install java 8 in order to correctly update the JAVA_HOME environment variable?

2017-11-09
Randy Groves
Randy Groves4:25 AM

This worked for me. I had previously installed Java 9, but backed off of that, reinstalling the Java SDK and then Eclipse. Just attempted to do a build, and got a JAVA_HOME error. Setting JAVA_HOME in the environment variables to point to the JDK location rather than the JRE location was the ticket. I've got jdk1.8.0_151 installed - from the dev kit .exe off of the Oracle Java site. I had no luck with the oomph installer.

Dana Batali
Dana Batali10:40 AM

@Randy Groves can you confirm that you built an example robot successfully? (at 4:25am? whoa... :-O)

Darwin Clark
Darwin Clark10:45 AM

Randy's description matches my fix. I am NOT leading people to install java 9. My fix worked fine with java 8. I remember leaving the meeting yesterday and someone (I think it was Binnur) explicitly saying NOT to install java 9.

Dana Batali
Dana Batali10:52 AM

absolutely agree that we need jdk 1.8.X... It's confusingly possible to install java 9 yet identify jdk 1.8 as the build dependency. Since this point is bound to confuse, it may be best to simply follow last year's procedure exactly.

Randy Groves
Randy Groves10:58 AM

Yes, I did build successfully.

Darwin Clark
Darwin Clark11:00 AM

Someone who has access to the presentation (intro to programming) should make a note about this fix on the 'exercise1' slide.

Dana Batali
Dana Batali11:25 AM

this thread: https://stackoverflow.com/questions/1288343/how-to-change-java-home-for-eclipse-ant is one of the problems we ran into last night... Even with the correct java version installed, there can be problems.

Dana Batali
Dana Batali11:29 AM

here's what success would look like if you are compiling from home (and don't have access to the robot):

Darwin Clark
Darwin Clark11:32 AM

What does JRE even stand for?

Dana Batali
Dana Batali11:33 AM

java runtime environment (all the support libraries, etc) compared to the java compiler

Dana Batali
Dana Batali11:49 AM

Problem with Getting Started Template

Project 'Getting Started' is missing required library: 'C:\Users\danab\wpilib\java\current\lib\wpiutil.jar'

So the recommendation is to build the command-based robot via File->New->Other-> WPILib Robot Development -> Robot Java Project -> Command-Based Robot.

Dana Batali
Dana Batali1:42 PM

For those who are still grappling with eclipse installation, I just updated the slides to augment declan's original instructions. New material starts on slide 40

2017-11-11
Randy Groves
Randy Groves5:29 PM

Do we have access to SolidWorks?

Dana Batali
Dana Batali5:34 PM

this is probably a question for the cad_team channel. The short answer (when I last looked into this) is that a free educational license is available through FIRST. Longer answers involve focusing the team on a single cad package that runs on mac and windows...

Paul Vibrans
Paul Vibrans7:01 PM

FIRST has agreements with Autodesk, PTC, and SolidWorks. That seems to mean free downloads of student versions of their 3D CAD software. For Autodesk that means Fusion 360, on which the team standardized last year.

2017-11-12
Randy Groves
Randy Groves10:02 AM

@Riyadth Al-Kazily Moving from the CAD channel - That was going to be one of my next questions - is a license key available for Driver Station so that we can play at home?

Binnur Alkazily
Binnur Alkazily10:45 AM

I think that may have been documented somewhere in the github wiki, or at least on https://wpilib.screenstepslive.com/s. The license key is from NI and is somewhere w/ the other license packages that Chee has, I believe.

Enrique Chee
Enrique Chee11:41 AM

It is with programming. If not, then in the robotics room with the electronic stuff. I will search this week.

Declan Freeman-Gleason
Declan Freeman-Gleason12:38 PM

@Randy Groves It turns out that you can run the driver station in evaluation mode, see the following link for details: http://wpilib.screenstepslive.com/s/4485/m/13503/l/599670-installing-the-frc-2017-update-suite-all-languages

Declan Freeman-Gleason
Declan Freeman-Gleason3:02 PM

@Randy Groves Success! I have the robot code actually running TeleOp/Autonomous in the emulator... Unfortunately, I don't think there is a way to get any of this into the actual physics simulator they provide, but there is certainly a class of errors that we can catch with this level of simulation! You can also see that I have the dashboard/NetworkTables connected as well. This reminds me that you have to set up a loopback adapter to successfully connect the dashboard or the NI software. See https://www.youtube.com/watch?v=xJw4EcfFiJY

2017-11-14
Enrique Chee
Enrique Chee4:16 PM

The license key from NI is in the robotics room.

Declan Freeman-Gleason
Declan Freeman-Gleason6:14 PM

@Enrique Chee Thank you Mr. Chee. Hopefully we can just use "Evaluation Mode" on everything but the driver station (it doesn't require the key). I plan to write down the key and post it tomorrow on Slack, just in case.

2017-11-16
Binnur Alkazily
Binnur Alkazily8:33 PM

@Charlie Standridge for Saturday let's make sure you have a well-working computer w/ Eclipse. Given the strange slowness of your computer, take a look at Windows' admin tools for disk utilities, do a virus scan, etc. And, we can always set you up w/ a computer at school!

2017-11-17
Declan Freeman-Gleason
Declan Freeman-Gleason6:04 PM

@Emma Lahtinen @Cory_Houser @Charlie Standridge @Josh Goguen @Willie Barcott @Austin Smith @Ulysses Glanzrock @Ryan Olney @Darwin Clark @Justice James @Adam Rideout Redeker I gave incorrect instructions that asked you to download Java Development Kit 9, which caused issues last meeting. For this reason we need you to download JDK 8 from the below link by tomorrow. You don't need to do anything else. Download link: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

2017-11-18
Binnur Alkazily
Binnur Alkazily9:21 AM

running late. Should be in by 10am. FYI

2017-11-21
Dana Batali
Dana Batali7:07 PM

@Declan Freeman-Gleason @Darwin Clark @Riyadth Al-Kazily - to get to the bottom of the performance difference between jetson and raspi, here's a simple c-file that computes a standard floating-point benchmark... It would be a great exercise to get the numbers on the raspi 2, 3 and jetson... I can contribute 2 and 3, @Darwin Clark: can you do the jetson? If you have time in the next several days (not thursday), let me know if you have any questions:

http://www.netlib.org/benchmark/linpackc.new

Dana Batali
Dana Batali7:12 PM

as an example: here's the output for the default array size (200): on an old imac (i expect the numbers to be slower/lower on raspberry pi and jetson):


LINPACK benchmark, Single precision.
Machine precision: 6 digits.
Array size 200 X 200.
Average rolled and unrolled performance:

Reps Time(s) DGEFA DGESL OVERHEAD KFLOPS
----------------------------------------------------
128 0.57 83.38% 2.40% 14.22% 360647.344
256 1.14 83.46% 2.39% 14.15% 358241.625
512 2.27 83.41% 2.40% 14.20% 361488.062
1024 4.54 83.45% 2.39% 14.16% 361188.000
2048 9.04 83.42% 2.40% 14.18% 362719.188
4096 18.09 83.41% 2.41% 14.18% 362354.906

Dana Batali
Dana Batali7:13 PM

KFLOPS is what we're after
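Since KFLOPS is the number of interest, a small helper can pull that column out of the benchmark output and average it. This is a convenience sketch that assumes the table layout shown above (six fields per data row, starting with the integer Reps count):

```python
def average_kflops(output):
    """Average the KFLOPS column (last field) of linpack's results table.
    Assumes the layout shown above: data rows have 6 fields and begin
    with the integer Reps count; header/separator rows are skipped."""
    vals = [
        float(parts[-1])
        for parts in (line.split() for line in output.splitlines())
        if len(parts) == 6 and parts[0].isdigit()
    ]
    return sum(vals) / len(vals) if vals else 0.0
```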

Dana Batali
Dana Batali7:14 PM

and here is the trivial compilation:

`gcc linkapackc.new -o linkapack`

Dana Batali
Dana Batali7:17 PM

two other points:

1. we want to edit this file to focus on single-precision performance. This means we want line 21 of the c file to read

`#define SP`

2. optimization matters.. Here's a build line that produces better results on mac:

`gcc -O3 linkapackc.new -o linkapack`


Reps Time(s) DGEFA DGESL OVERHEAD KFLOPS
----------------------------------------------------
512 0.55 84.73% 2.83% 12.44% 1461630.750
1024 1.10 84.66% 2.83% 12.51% 1462711.125
2048 2.21 84.70% 2.82% 12.48% 1454971.250
4096 4.40 84.72% 2.82% 12.46% 1462078.750
8192 8.84 84.72% 2.82% 12.46% 1454345.500
16384 17.69 84.71% 2.83% 12.46% 1453299.750

Darwin Clark
Darwin Clark9:02 PM

Alright Dana, you got it. I'll have the numbers by Monday.

2017-11-26
Darwin Clark
Darwin Clark12:54 PM

@Dana Batali I spent roughly an hour on this, trying to get past compilation errors. Whenever I compiled the file (I tried three different ways) it would spit out a ton of errors looking like this:

Darwin Clark
Darwin Clark12:54 PM

linepack.c:1:6: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘.’ token
 NPACK.C  Linpack benchmark, calculates FLOPS.
      ^
In file included from /usr/include/stdio.h:74:0,
                 from linepack.c:23:
/usr/include/libio.h:306:3: error: unknown type name ‘size_t’
   size_t __pad5;
   ^
/usr/include/libio.h:310:67: error: ‘size_t’ undeclared here (not in a function)
   char _unused2[15 * sizeof (int) - 4 * sizeof (void *) - sizeof (size_t)];
                                                                   ^
/usr/include/libio.h:338:62: error: expected declaration specifiers or ‘...’ before ‘size_t’
 typedef __ssize_t __io_read_fn (void *__cookie, char *__buf, size_t __nbytes);
                                                              ^
/usr/include/libio.h:347:6: error: expected declaration specifiers or ‘...’ before ‘size_t’
      size_t __n);
      ^
/usr/include/libio.h:469:19: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘_IO_sgetn’
 extern _IO_size_t _IO_sgetn (_IO_FILE *, void *, _IO_size_t);
                   ^
In file included from linepack.c:23:0:
/usr/include/stdio.h:319:35: error: expected declaration specifiers or ‘...’ before ‘size_t’
 extern FILE *fmemopen (void *__s, size_t __len, const char *__modes)
                                   ^
/usr/include/stdio.h:325:47: error: expected declaration specifiers or ‘...’ before ‘size_t’
 extern FILE *open_memstream (char **__bufloc, size_t *__sizeloc) __THROW __wur;
                                               ^
/usr/include/stdio.h:337:20: error: expected declaration specifiers or ‘...’ before ‘size_t’
       int __modes, size_t __n) __THROW;
                    ^
/usr/include/stdio.h:344:10: error: expected declaration specifiers or ‘...’ before ‘size_t’
       size_t __size) __THROW;
       ^
/usr/include/stdio.h:386:44: error: expected declaration specifiers or ‘...’ before ‘size_t’
 extern int snprintf (char *__restrict __s, size_t __maxlen,
                                            ^
/usr/include/stdio.h:390:45: error: expected declaration specifiers or ‘...’ before ‘size_t’
 extern int vsnprintf (char *__restrict __s, size_t __maxlen,
                                             ^

Darwin Clark
Darwin Clark12:55 PM

That is a shortened version, because the entire error message would be way too long. The three methods of compiling I used were:

Darwin Clark
Darwin Clark12:56 PM

gcc linkapackc.new -o linkapack,
cc -O -o linpack linpack.c -lm
and simply, gcc linpack.c

Darwin Clark
Darwin Clark12:57 PM

I'm a little thrown off by the file types, because on the link you gave (http://www.netlib.org/benchmark/linpackc.new) the linpack is saved as a C file, and in the command YOU gave, linpack is saved as a .new file.

Darwin Clark
Darwin Clark12:57 PM

Got any suggestions?

Dana Batali
Dana Batali1:25 PM

mysterious - can you rename the file to end in .c?

`mv linpackc.new linpack.c`

Dana Batali
Dana Batali1:51 PM

just got some numbers on my jetson tk1... I'll post them presently. Perhaps you can compare my steps:

1. i used wget to obtain the file via:

`wget http://www.netlib.org/benchmark/linpackc.new`

1a. rename the file:
`mv linpackc.new linpacknew.c`

2. edit linpacknew.c, change `#define DP` to `#define SP` on line 31.

3. compile the file:
`gcc -O3 linpacknew.c -o linpacknew`

4. run the file:
`./linpacknew`

Darwin Clark
Darwin Clark2:09 PM

Ahh I see, I added #define SP, but other than that I followed the same steps

Darwin Clark
Darwin Clark2:09 PM

I'll try those steps once I get back home

Dana Batali
Dana Batali2:11 PM

here are numbers i got on my jetson:


Reps Time(s) DGEFA DGESL OVERHEAD KFLOPS
----------------------------------------------------
512 0.75 86.68% 3.00% 10.32% 1042013.250
1024 1.50 86.66% 3.00% 10.34% 1042433.500
2048 3.01 86.66% 3.00% 10.33% 1041903.188
4096 6.02 86.66% 3.00% 10.33% 1041714.125
8192 12.14 86.50% 3.14% 10.36% 1033971.625

Dana Batali
Dana Batali2:22 PM

here are numbers from a raspi3:


LINPACK benchmark, Single precision.
Machine precision: 6 digits.
Array size 200 X 200.
Average rolled and unrolled performance:

Reps Time(s) DGEFA DGESL OVERHEAD KFLOPS
----------------------------------------------------
128 0.99 89.64% 2.84% 7.52% 191614.531
256 1.98 89.65% 2.83% 7.52% 191611.078
512 3.97 89.64% 2.84% 7.52% 191589.609
1024 7.94 89.64% 2.84% 7.52% 191588.969
2048 15.87 89.64% 2.84% 7.52% 191589.547

raspi 2 numbers were in the 151000 range.

Dana Batali
Dana Batali2:25 PM

so it appears that the jetson has significantly better floating point performance as @Riyadth Al-Kazily guessed:

1042013.250 / 191614.531 ≈ 5.44x faster

Dana Batali
Dana Batali2:35 PM

for completeness, i obtained numbers from beaglebone black:


LINPACK benchmark, Single precision.
Machine precision: 6 digits.
Array size 200 X 200.
Average rolled and unrolled performance:

Reps Time(s) DGEFA DGESL OVERHEAD KFLOPS
----------------------------------------------------
16 0.55 92.57% 2.66% 4.77% 41841.398
32 1.11 92.54% 2.65% 4.81% 41730.660
64 2.21 92.51% 2.66% 4.83% 41864.852
128 4.38 92.53% 2.64% 4.83% 42193.164
256 8.87 92.50% 2.67% 4.83% 41655.719
512 17.40 92.53% 2.63% 4.83% 42462.664

Dana Batali
Dana Batali4:17 PM

here's a work-in-progress spreadsheet looking at the different vision platforms... Happy to make it editable by @Darwin Clark or @Declan Freeman-Gleason if you provide me your gmail addresses.

https://docs.google.com/spreadsheets/d/1o-Ma2ZDfu3egSJ9etxWQ8HrdiuvEqC2BwkLKiqaqA/edit?usp=sharing

Darwin Clark
Darwin Clark5:39 PM

Dana, I'm in if you want to share it with me. GCC is also saying that -03 is not a valid command line option. What does -03 do? Other than that, I'm using the same steps. Jack Stratton PM'd me and told me to add #include <stddef.h>, which I also did, and I had the same result.

Dana Batali
Dana Batali10:51 PM

-O3 is optimization level 3... it should be an O, not a zero

Dana Batali
Dana Batali10:57 PM

seems to me that your file may have been corrupted... the first lines should look like this:


/*

LINPACK.C ..etc..
etc..
*/

Dana Batali
Dana Batali10:58 PM

where /* is a c-style comment block starting and */ is the ending

Dana Batali
Dana Batali10:59 PM

this is the clue i'm focused on:


linepack.c:1:6: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘.’ token
 NPACK.C  Linpack benchmark, calculates FLOPS.


that is, "NPACK.C ..." is supposed to be within a comment block.

2017-11-27
Declan Freeman-Gleason
Declan Freeman-Gleason8:30 AM

@Dana Batali I was able to compile on my laptop... If it is corruption maybe you could post a `sha1sum` of the file for Darwin to compare.

Dana Batali
Dana Batali8:54 AM

or he should just re-download and/or visually inspect

Dana Batali
Dana Batali8:55 AM

`sum linpacknew.c`
05852 22

2017-11-29
Jules Blythe
Jules Blythe7:19 PM

@Jules Blythe has joined the channel

2017-11-30
Dana Batali
Dana Batali6:04 PM

for what it's worth, i added a couple more slides in the programming 101 section to introduce basic java syntax. Hard to teach all of java in only a couple slides, but hopefully it will ease new programmers into reading the Logger example file.

Dana Batali
Dana Batali6:04 PM

also: there was a report of a problem with a slide, something to do with CANTalon... Can anyone point me to the bug? @Binnur Alkazily?

Binnur Alkazily
Binnur Alkazily7:38 PM

See exercise 1 part 2 reference for import line

Binnur Alkazily
Binnur Alkazily7:40 PM

Thank you!!

Declan Freeman-Gleason
Declan Freeman-Gleason9:45 PM

@Emma Lahtinen @Cory_Houser @Charlie Standridge @Josh Goguen @Willie Barcott @Austin Smith @Ulysses Glanzrock @Ryan Olney @Darwin Clark @Justice James @Adam Rideout Redeker We need you to make a GitHub account before the next meeting. Please go to https://github.com/join and sign up (you want the free account). First you must verify your email, then send me (over a Slack direct message) the username you chose so I can invite you to the Spartronics4915 GitHub organization.

Darwin Clark
Darwin Clark9:48 PM

@Declan Freeman-Gleason is there any standard for github profile picture?

Declan Freeman-Gleason
Declan Freeman-Gleason9:53 PM

@Darwin Clark Nope. Just don't use anything obviously unsavory.

2017-12-01
Dana Batali
Dana Batali8:56 AM

fixed! Thanks for the report!

Binnur Alkazily
Binnur Alkazily7:24 PM

Thank you!!

Binnur Alkazily
Binnur Alkazily7:28 PM

Mentors would like to know who is who, so please keep that in mind. Another thing to note: usually (not required, but it's nice to have a history of your work) people keep the same account/handle through their college career and use it in their resumes for reference. So, choose your account names/handles carefully.

Declan Freeman-Gleason
Declan Freeman-Gleason9:22 PM

If you want to keep a handle that you have, you can set your name in GitHub, which still allows us to identify you.

2017-12-02
Chris Rininger
Chris Rininger12:39 AM

I know the pi was one of our vision options - just saw this & thought I would share... https://www.theverge.com/2017/11/30/16720322/google-aiy-vision-kit-raspberry-pi-announce-release

Randy Groves
Randy Groves11:04 AM

Unfortunately, these are only available for pickup at Micro Center stores at this time. No indication when or if they will ship.

Randy Groves
Randy Groves11:06 AM

Field trip? Either Tustin, CA or Denver, CO are the closest :slightly_smiling_face:

Dana Batali
Dana Batali11:27 AM

fyi: (reference the pi-vision thread above) our current operating approach is to use the jetson tk1 during development (atop opencv and python). Once we've settled on a vision algorithm (pipeline), we'll benchmark the same algorithm on the pi. If our algorithm can perform sufficiently well on pi, we'd select that on the basis of power consumption. The current thinking is that all this neural-net + machine learning material, while very interesting, is way overkill for typical FRC vision challenges. Here is the wip spreadsheet on platform selection (https://docs.google.com/spreadsheets/d/1o-Ma2ZDfu3egSJ9etxWQ8HrdiuvEqC2BwkLKiqaqA/edit?usp=sharing). Note that it shows the TK1 has ~5X floating point performance over the pi/jevois. We'll probably also need such a spreadsheet for algorithm selection. As always, feedback is welcome. (and fyi: @Declan Freeman-Gleason and @Darwin Clark are the current vision team)

2017-12-05
Chris Rininger
Chris Rininger12:11 PM

Sounds like direction is jetson... For possible future reference then, FYI a white paper was just published on FRC + jevois: https://www.chiefdelphi.com/forums/showthread.php?threadid=160325

Dana Batali
Dana Batali3:15 PM

watching this presentation turned up another platform we hadn't yet considered - android phone. 254 chose that over jetson because of power issues.

Chris Rininger
Chris Rininger4:41 PM

I remember reading about that... from 2015 I think & then several teams successfully used 254’s documentation to deploy the solution the following year

2017-12-06
Dana Batali
Dana Batali10:16 AM

they appear to have used the same approach in 2016 and 2017, judging from their github.

Dana Batali
Dana Batali2:55 PM

here is the source for their vision pipeline (opencv + .cpp + .jni):

https://github.com/Team254/FRC-2017-Public/blob/master/visionapp/app/src/main/jni/imageprocessor.cpp

summary: pretty straightforward combination of hsv ranges plus contour extraction
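As a rough illustration of that pipeline idea (HSV range test, then locating the surviving region), here is a numpy-only toy. A real implementation would use `cv2.inRange` and `cv2.findContours` as 254's code does; the bounding-box step below is only a crude stand-in for contour extraction:

```python
import numpy as np

def find_target(hsv_img, lo, hi):
    """Keep pixels whose H, S, V all fall inside [lo, hi] (the cv2.inRange
    idea), then return the bounding box (x0, y0, x1, y1) of the surviving
    pixels -- a crude stand-in for contour extraction."""
    mask = np.all((hsv_img >= lo) & (hsv_img <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no pixels in range: no target visible
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```

From a box like this you can derive the peg offset (box center vs. image center) that the earlier discussion was about.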

2017-12-07
Dana Batali
Dana Batali11:37 AM

@Darwin Clark @Declan Freeman-Gleason: it might be interesting to transcribe this to python so we can consider its benefits relative to simple blob detector

2017-12-08
Dana Batali
Dana Batali4:25 PM

another opencv vision pipeline here: (from team 1678):

https://github.com/frc1678/robot-code-public/blob/master/muan/vision/vision.cpp

Dana Batali
Dana Batali4:28 PM

(team 1678 uses jetson)

Declan Freeman-Gleason
Declan Freeman-Gleason6:05 PM

Reposting a great (but long) video from 254, this time about motion planning and control. If you're interested in advanced dead reckoning and the like, I recommend you watch this. https://youtu.be/8319J1BEHwM

2017-12-09
Dana Batali
Dana Batali11:44 AM

the one I posted above is one year newer and, thus, might be preferable.

Declan Freeman-Gleason
Declan Freeman-Gleason12:16 PM

@Dana Batali I thought this was on a different subject, correct me if I'm wrong (I've only watched a tiny bit of yours), but yours is about vision? The video I posted is about path following using encoders.

Dana Batali
Dana Batali12:18 PM

the video i posted is primarily about robot motion control; vision is a part of it. Since it's newer, you can sense the migration of 254's opinions from roll-your-own to using CANTalon... Perhaps it's worth watching both.

2017-12-14
Dana Batali
Dana Batali12:40 PM

@Darwin Clark here's a whitepaper from team chillout discussing setup of raspberry pi for vision. In particular they show how to play with camera exposure settings outside of opencv. If we can accomplish this with opencv, we should but if not, I expect this recipe to work for jetson and pi (but haven't tried it yet):

https://github.com/MTHSRoboticsClub/Documentation/blob/master/Computer%20Vision%20with%20Raspberry%20Pi.md

A reminder: as we discussed last night, your goal was to get the same opencv pipeline as is present in the jni dir of Team254's vision_app running in our python/jetson environment. I'm happy to answer questions as you encounter them, just post them here with a @Dana Batali

Chris Rininger
Chris Rininger1:56 PM

@Dana Batali FYI - one more canned vision option: https://www.chiefdelphi.com/forums/showthread.php?threadid=160391 Expensive relative to others but marketed as easiest plug/play option available

Dana Batali
Dana Batali2:15 PM

@Darwin Clark: please check out the limelight thread! Other than the $400 pricetag, it seems ideal. Other potential downsides: if their hard-coded vision pipeline doesn't map onto this year's game or vision challenge (it's a black box, like pixycam in this way).

Enrique Chee
Enrique Chee2:23 PM

Let me know if we want to purchase.

Dana Batali
Dana Batali2:24 PM

@Chris Rininger: for people who don't want to or can't afford to learn vision, this seems like an ideal solution. Even for those that have dabbled in vision, the combo of low latency and high framerate offers the unique ability to use it as a sensor in the control loop. This is the most attractive part of it. The 254 solution (written by one of the contributors to that thread) does lots of magic kinematics to compute where the targets are on the presumption that the data is a few tenths of a second old. There is a lot of code complexity and margin for error with that approach, but they delivered real results for the last few years. Other issues with this solution: it doesn't appear to be available yet. Interestingly, it uses the raspberry pi 3, so other than un-optimized opencv pipelines, we should be able to deliver the same performance with a pi. One other consideration: there is a rule regarding the budget for robots. It may be updated each year, but $400 seems like a fairly large tack-on. @Enrique Chee do you remember the approx guidelines for budget limits on non-KOP components?

Enrique Chee
Enrique Chee2:27 PM

I believe it is $500 per item for an off-the-shelf item.

Enrique Chee
Enrique Chee2:27 PM

$5000 total for robot.

Dana Batali
Dana Batali2:47 PM

a great post within this limelight thread (from jared russel of 254): https://www.chiefdelphi.com/forums/showpost.php?p=1712864&postcount=92

Dana Batali
Dana Batali3:11 PM

@Enrique Chee: my sense is that $400 seems like a lot to spend on vision without assurances that it will play an important role in this year's game. That said: these folks definitely "get it" and have promised to provide updates during the build season that are game specific. Jared's threads above make a good case that this is a reasonable investment. There is a question of whether they'll be able to deliver on their promises to anyone who wants one. But of all the options we've looked at so far, this one stands out for being easy to integrate, designed for stability during real match play, and fully turnkey. I feel that the choice to purchase a couple of these units may be above my pay grade. If team leadership feels that the cost/benefit/risk aspects are acceptable, I'll be happy to help make it work. If their sales pitch is accurate, my help won't be needed at all!

Enrique Chee
Enrique Chee3:17 PM

Thanks. I will wait for student programmers' input. How about we just get one first?

Dana Batali
Dana Batali3:19 PM

one will definitely work, second would only be needed as a backup or for a second robot which happens later

Declan Freeman-Gleason
Declan Freeman-Gleason4:08 PM

@Dana Batali I think that limelight is really attractive... The only real catch is that it's a "black box". It is encouraging that they can achieve 90 fps with a Raspberry Pi and NetworkTables; it basically confirms that those aren't fundamentally incapable of the task.

Dana Batali
Dana Batali4:09 PM

@Declan Freeman-Gleason - yes i learned that there is a tuning parameter for network table updates. If/when we get back to your latency tests, we'll want to make sure we tune that knob.

Dana Batali
Dana Batali4:09 PM

(defaults to 10fps)

Declan Freeman-Gleason
Declan Freeman-Gleason4:18 PM

@Dana Batali While reading the CD thread someone mentioned a `flush` method that is new this year. I believe that would be the preferred way to do it (e.g. call `flush` for every loop iteration), but we should really look more into it.

Dana Batali
Dana Batali4:20 PM

@Declan Freeman-Gleason - yes i saw that - it might not actually matter if the only things in the table are the vision parameters that are being flushed. My understanding is that values that aren't changed aren't sent during the update frame.

Dana Batali
Dana Batali4:28 PM

@Declan Freeman-Gleason: from http://robotpy.readthedocs.io/projects/pynetworktables/en/stable/api.html:

- flush exists since 2017 (but is "rate-limited")
- setUpdateRate is the means to establish the rate; default for C is 100ms, default for Python is 50ms

Chris Rininger
Chris Rininger9:53 PM

Wow, I didn't know my FYI on Limelight would create such a stir.

My 2 cents on cost/benefit: One of our team's top strengths is resources/fundraising, and arguably one of our biggest limitations compared with our "peer" teams is time available. Our students' school workload is, I feel, quite a bit higher than average... That and other constraints (e.g. timing of finals) result in our team not being able to work 5 to 6 days a week during the build and competition seasons like many top teams do. So I would argue it is playing to our strengths if we utilize tools / parts that may cost a premium when the result is getting to a solution in less time.

This extends beyond a sensor/programming tool like this IMO... I think it may be worthwhile to outsource manufacture of certain parts, for example, to make them available sooner and free up time to spend on other things.

2017-12-15
Dana Batali
Dana Batali8:32 AM

Chris - i agree with all your points, but the one point of contention i have is that the goal of this STEM program is STEM, and teams that invest in skill development are doing more for their students than teams that simply buy all their canned solutions. I always favor making this a learning experience over a winning one, though ideally both go together :slightlysmilingface:. Students and student leadership are generally surfing these tradeoffs across all subteams, so it's good to share our differing perspectives for their consideration. When it comes to limelight, I would say that a canned solution like this is no worse than the situations where mentors write the entire vision subsystem (as appears to have been the case with 254). I believe that building a limelight-like solution is within the skillset of our team, but as you say the time commitment is substantial, and there may be better ways for our students to invest their time.

Chris Rininger
Chris Rininger9:24 AM

Totally agree with the goal of learning being THE top priority, absolutely including applied STEM skills but also holistic solution delivery skills like awareness of organizational strengths & constraints, evaluating trade-offs in that context, & then making decisions that have the highest probability of meeting overall objectives. Multiple kinds of lessons to learn. I agree these ideally go together (it's on us mentors to make that happen, and I believe we can). I also feel like there's a bit of a difference during build season when the clock is ticking vs. the off-season and pre-season. I believe tenured teams focus on skills building between the end of competition season and the next build season, then in build/competition seasons they shift to prioritizing building the best robot possible & continuously improving it (learning still happening, but less the top priority - more about applying what has already been learned). My comment above is sort of in that context... an "in build season" perspective.

Dana Batali
Dana Batali10:20 AM

from the chillout programming webinar: https://youtu.be/Sq-wdLI3whk


Here is the link for the webinar on the new control system for 2018. If you're interested in showing others on your teams: 2018 Control System Webinar.

A link to the power point and other useful links are in the description of the video.

Please Note: In the webinar Rob states that the Talon motor controller libraries, now called Phoenix, are some of the biggest changes for the upcoming year. Well, as of the new release by Cross the Road yesterday the change was even greater. Please take a look at the new libraries from Cross the Road; they will force you to re-think how you program the robot to move.

Rob's email is

- Scott Davidson
Chillout 1778

Declan Freeman-Gleason
Declan Freeman-Gleason4:53 PM

@Dana Batali Thread for the 254 porting effort.

Declan Freeman-Gleason
Declan Freeman-Gleason4:54 PM

Their code compiled and deployed fine out of the box, so it doesn't appear there are any compile time dependencies that I don't have. I'm going to work on the NavX conversion next (after a bit more reading through the codebase).

Dana Batali
Dana Batali5:40 PM

Nice! I'm a little surprised... Doesn't the drive subsystem depend on navx? The Kauai namespace?

Declan Freeman-Gleason
Declan Freeman-Gleason5:42 PM

It does. The library is bundled with the repo. See: https://github.com/Spartronics4915/FRC-2017-Public/tree/master/lib

Declan Freeman-Gleason
Declan Freeman-Gleason5:43 PM

That isn't to say it doesn't yell at me about it at runtime.

Declan Freeman-Gleason
Declan Freeman-Gleason5:44 PM

In idle mode it appears to run without any uncaught exceptions or the like though, it just spews about not being able to find the NavX.

Declan Freeman-Gleason
Declan Freeman-Gleason5:47 PM

The `jama` library is for linear algebra, `jetty` is for various HTTP support, and the `json` package is pretty self explanatory. I'm still not sure about where these are used.

Declan Freeman-Gleason
Declan Freeman-Gleason6:46 PM

It turns out that this robot doesn't have an IMU plugged in, so that effort is stalled until Monday when I can take the one from last year's robot (It's my impression that we only have one IMU, is that true @Peter Hall?)

Dana Batali
Dana Batali7:49 PM

I'm certain we have at least two IMUs... Right @Clio Batali? (shot in the dark)
JSON is used for comm with vision, FYI
Great job so far!

Dana Batali
Dana Batali8:21 PM

ideas on next steps:
1. we need an imu :slightlysmilingface: (even if we need to steal it off the primary robot)
2. perhaps it's worth cloning their Drive into Drive2, then deleting the navx and gear shifting... Then replacing Drive with Drive2 in all locations, then deleting navx from the repo (or renaming it so we're sure it's not needed). You'd still be able to refer to the original whilst moving forward on a real road to our immediate future.

Have you been able to "drive"? (even on blocks)

Dana Batali
Dana Batali8:24 PM

also have you figured out how to run in "test" mode? (cf: testInit())... I believe there's a way to do this via the driver station?

Declan Freeman-Gleason
Declan Freeman-Gleason9:17 PM

I've actually already started removing the NavX. I just deleted `mNavX` and replaced the areas that had compilation errors with comparable BNO055 code. The issue with this is that the BNO055 doesn't really have a good way to reset/zero itself. I looked through the BNO055 reference PDF and the library we use and I found a way to reset it. That's where I got blocked because I didn't have an IMU to test this reset functionality on.

Declan Freeman-Gleason
Declan Freeman-Gleason9:19 PM

I think I'm just going to modify `Drive` directly, because it cuts out a bunch of refactoring... After all, I have their Github to refer to.

Declan Freeman-Gleason
Declan Freeman-Gleason9:20 PM

My plan for tomorrow is to try running on blocks, along with test mode (I recall seeing it as an option along with autonomous and teleop in the driver station), and I will remove the NavX library itself just to be sure. I was pretty exhausted today, so I called it quits after getting relatively little done.

Dana Batali
Dana Batali11:13 PM

Sounds like you made great progress!!

Dana Batali
Dana Batali11:13 PM

I like your plan of attack

2017-12-16
Declan Freeman-Gleason
Declan Freeman-Gleason12:06 PM

The robot can run on blocks! I also tried out test mode, which works. I've converted everything to use the BNO055 (see https://github.com/Spartronics4915/FRC-2017-Public/commit/03f63959fe4e0f7a35e4d16aa85ebd914da42527), and I've added a notes file that I'll use to track my discoveries (see https://github.com/Spartronics4915/FRC-2017-Public/blob/master/notes.md). It's now spewing about not being able to find `/usr/bin/adb`, which means we have to install that manually.

Dana Batali
Dana Batali12:09 PM

great news!!! I advise you tread with caution wrt adb. I noticed that the installer script performs a `rm -rf *` prior to installation. Another investigation might be to try to figure out how to disable the adb-dependency as we're still far from certain that we'd choose that means of vision communication.

Dana Batali
Dana Batali12:13 PM

it might be as simple as commenting out the VisionServer init in Robot.java

Dana Batali
Dana Batali12:14 PM

(as well as a couple of other lines there)

Declan Freeman-Gleason
Declan Freeman-Gleason12:17 PM

I'm testing that now... It should work, because the only place that calls `VisionServer.getInstance` is Robot init and `VisionServerTest.java` (which appears to never get constructed, so I think we're OK on that one).

Declan Freeman-Gleason
Declan Freeman-Gleason12:30 PM

@Dana Batali I was able to successfully remove the `VisionServer` spew. Now we're getting yelled at by CTRE because it can't find all the devices on the CAN bus. They have a lot of Talons, so I can't just assign them all. It's not really a serious issue though. Driving is exhibiting weird behavior... Would that be a good next step, or is there something else you can think of that would be good?

Dana Batali
Dana Batali12:33 PM

perhaps you can disable all the subsystems except drive? We'd need to disable solenoid etc there... I think the primary first goal is to see if you can get a manual drive working (then path following)... So if you can get rid of as much noise/distraction as possible, that'd be good. The subsystem manager and the Superstructure would need to get lots of commenting in addition to the obvious getInstance calls in Robot

Dana Batali
Dana Batali12:33 PM

(i guess this is really a question of where does the weird behavior come from?)

Dana Batali
Dana Batali4:05 PM

@Darwin Clark @Declan Freeman-Gleason: i just checked in a script called testCam.py under python. It shows how to disable auto-exposure and establish an exposure value with the intent of making bright things white and all other things black. While doing this I learned that the microsoft hd 3000 cam can maximally output 30 fps (even at low res). If we wish to be competitive against limelightvision, we'll need a camera that does better than that. Of course the raspberry pi 2 camera appears to achieve higher rates.

2017-12-17
Declan Freeman-Gleason
Declan Freeman-Gleason8:53 AM

@Dana Batali Does the picam support the same adjustments?

Dana Batali
Dana Batali9:32 AM

I haven't verified that this is the case, but I'm confident we'll find a way to do this with picam

Dana Batali
Dana Batali9:33 AM

other alternatives on jetson would be a faster usb camera - here i think the usb isn't the problem, just that the lifecam is about web & resolution, not fast frame times.

Declan Freeman-Gleason
Declan Freeman-Gleason10:07 PM

Alright, I've removed pretty much all game specific subsystems and a lot of other extraneous game specific code. Driving works well now, I think the issue was that I misunderstood the expected behavior (`CheesyDrive` is really cool!), and the controller mappings were wrong. There is no more spew now, which is good. I'm going to try to get path planning working next.
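For anyone curious why `CheesyDrive` feels good to drive: the core idea (a minimal sketch under my own assumptions — hypothetical code, NOT 254's actual implementation, which layers quick-turn handling, input shaping, and negative-inertia compensation on top) is that the wheel input commands curvature rather than a raw turn rate, so turn radius stays consistent as speed changes:

```java
// Minimal sketch of the curvature-drive idea behind CheesyDrive.
// Illustrative only -- not 254's real code.
class CurvatureDriveSketch {
    // throttle, wheel in [-1, 1]; returns {left, right} motor outputs.
    static double[] drive(double throttle, double wheel, boolean quickTurn) {
        double turn = quickTurn
                ? wheel                         // spin in place when stopped
                : Math.abs(throttle) * wheel;   // curvature: turn scales with speed
        double left = throttle + turn;
        double right = throttle - turn;
        // Normalize so neither side exceeds full power:
        double max = Math.max(1.0, Math.max(Math.abs(left), Math.abs(right)));
        return new double[] { left / max, right / max };
    }
}
```

Note the quick-turn flag: without it, wheel input produces no motion at zero throttle, which is exactly the "weird behavior" a driver expecting tank-style turning might report.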

2017-12-18
Declan Freeman-Gleason
Declan Freeman-Gleason1:24 PM

@Dana Batali I answered all of our codebase evaluation questions. You can see them in this private gist: https://gist.github.com/pietroglyph/9625b008083ef91c5d53df9c12d4e7a6

Declan Freeman-Gleason
Declan Freeman-Gleason1:44 PM

Enrique Chee
Enrique Chee3:19 PM

We have extra imu

Dana Batali
Dana Batali3:22 PM

@Declan Freeman-Gleason - excellent progress! I would say you are definitely ahead of the game and in a position to assess the question of how this compares against vanilla WPI libs.

Declan Freeman-Gleason
Declan Freeman-Gleason3:58 PM

@Dana Batali I think there are a lot of small niceties that add up, which I really like--they often fix some annoyances or issues we had last year, or just do something better. The codebase seems like something that is well suited to more complexity than vanilla WPILib in general. I'm also not entirely sold. Although I'm beginning to get a handle on the codebase, I think that it's really risky to push something so new on everyone at the start of build season--if only 2 out of the 12 students and mentors understand the new system, we might have a problem. It's mostly the additional complexity, and difficulty of adopting a new system, that worries me. If I can get the repo in a ready-to-go state with path planning working, and if I can easily implement a new subsystem, then I will be ready to accept this.

Dana Batali
Dana Batali4:02 PM

@Declan Freeman-Gleason - i suppose its a good idea to be a bit conservative - that said there are other questions to consider:

1. do these features of the 254 codebase represent significant competitive capabilities? The answer probably depends on the 2018 game, but obvious candidates include: path following, odometry and pose estimation, and vision integration (this would be devalued if we can get control-loop vision going)

2. on last year's team, how many students and mentors could be described as understanding the WPI libs? My sense is that most don't, but the nests that the system provides can be occupied and improved with only a little understanding of the larger workings. Which leads to #3:

3. does 254's code allow for the building of such nests for subsystem and command development? This is what you're exploring when implementing a new subsystem. Along with that challenge, it makes sense to evaluate the command and control aspect of that.

Declan Freeman-Gleason
Declan Freeman-Gleason4:42 PM

1. Yes, there are definitely competitive advantages. Cheesy Drive, path following and planning, and vision come to mind. We will also be able to integrate anything new 254 does in the future.
2. To be honest, I don't know. I felt like I had a pretty good understanding of the system, but I can't speak for anyone else.
3. To be determined. It seems like things are pretty similar; the only thing that really encroaches upon the "nest" is `Superstructure`. In a way, that's the only really new concept people actually have to learn.

2017-12-19
Dana Batali
Dana Batali1:36 PM

@Declan Freeman-Gleason, @Enrique Chee - my quick perusal of the pigeon imu suggests to me that this is a reasonable choice for us this year. API is straightforward, price is only a couple bucks more than the adafruit BNO055, and most significantly, it can tie into the CAN bus (which, one hopes, is a good thing). Reboot/reset time is 5sec, so we need to be careful about when a reset is issued.

Declan Freeman-Gleason
Declan Freeman-Gleason2:10 PM

@Dana Batali I got the Pigeon working on the 254 codebase yesterday. It's fine. There are a few things I don't really like about the API, but it's not a big problem.

Dana Batali
Dana Batali2:12 PM

@Declan Freeman-Gleason cool! Did you wire it into CAN?

Declan Freeman-Gleason
Declan Freeman-Gleason2:16 PM

It's currently connected to the sensor port on a Talon (where an encoder might get plugged in). Connecting to CAN shouldn't be too hard, but it requires a bit of soldering (which I'm not entirely comfortable with).

Dana Batali
Dana Batali2:17 PM

another point i just ran into; it must be mounted with z-up... Had you seen this?

Declan Freeman-Gleason
Declan Freeman-Gleason2:28 PM

You mean pointing up?

Dana Batali
Dana Batali2:37 PM

this is what I found amusing: section 12.1 states that z is up but I haven't found a description of their coordinate system.

Dana Batali
Dana Batali2:37 PM

(imu user's guide)

Dana Batali
Dana Batali2:46 PM

quite likely z is the normal to the board with chips facing up... Also: as with all IMUs, mounting near the center of rotation is important.

Declan Freeman-Gleason
Declan Freeman-Gleason2:50 PM

It seems to work fine with chips up at (nearly) the center.

2017-12-20
Dana Batali
Dana Batali9:26 PM

does that make this a knot?

Dana Batali
Dana Batali9:26 PM

:slightlysmilingface:

2017-12-23
Chris Rininger
Chris Rininger5:33 AM

Interesting twist in this year’s fierce competition for FRC vision market share... See post 104 forward... https://www.chiefdelphi.com/forums/showthread.php?t=159883&page=7

2017-12-26
Randy Groves
Randy Groves9:43 AM

And further - looks like the 2073 team released their code for 'EagleTracker': https://github.com/team-2073-eagleforce/2073-jevois-vision-suite

2017-12-27
Declan Freeman-Gleason
Declan Freeman-Gleason8:21 PM

@Dana Batali I just got path following working. I took a bit of a break last week, but I want to get this repo ready for use in the next few days, which at this point is mostly refactoring. In that regard I need to (at least) do the following: port to the new Talon libraries, adapt to use our web dashboard, refactor the package names to `com.spartronics4915`, remove remaining game-specific code, and document the new system more. I'll post a demo video soon.

2017-12-28
Declan Freeman-Gleason
Declan Freeman-Gleason2:23 PM

Dana Batali
Dana Batali2:30 PM

NICE!!! i agree there are some small concerns with the termination conditions. I'm a little more concerned by the angle than the distance (as you seem to be). I would be curious to play with the velocities at each waypoint to see how that affects these errors. Hard to tell with only 3-4 waypoints whether the final tangent is parallel to the original heading.

Dana Batali
Dana Batali2:34 PM

Some questions wrt refactoring etc. Since you've more or less proven that this codebase gives us a serious advantage over pure WPI, the question for spartronics remains: are there nests where we can comfortably place new programmers? Have you had a chance to evaluate that question? Only after you've concluded that the answer is "indeed" would it make a ton of sense to perform all the refactoring. Specifically - I can't imagine more than two experienced programmers working within the Superstructure subsystem. On the other hand, I can easily imagine new programmers "owning" a simple subsystem and its associated actions. What are your thoughts regarding chaos management in this setting?

Declan Freeman-Gleason
Declan Freeman-Gleason7:54 PM

As you said, `Superstructure` is the sticking point here. This is the only new thing that encroaches upon the "nests", IMO, because most of the other stuff seems analogous to what we had in vanilla WPILib or easy to adapt to. I think that there are two options (if I understand the "two experienced programmers" bit as meaning it's going to be challenging for less experienced people): have a discrete step after the rest of the features are implemented (within the subsystem) where a student works with a mentor to integrate with `Superstructure`, or have a dedicated student who wrangles all the subsystems together in superstructure. I prefer the former.

Declan Freeman-Gleason
Declan Freeman-Gleason10:04 PM

I've also already refactored package names, and autonomous choosing and some debug output works with our dashboard (but I haven't incorporated `Logger`; I'm unsure about its value, and it's a rather time consuming task). I couldn't find the new Talon libraries that the Chill Out webinar talked about, and we still need to get our build system and Travis working. This is what needs to be done if we choose to use this system, I think.

Declan Freeman-Gleason
Declan Freeman-Gleason10:05 PM

@Dana Batali Before we make a decision to switch codebases or not, I think that we (and probably @Binnur Alkazily, @Ronan Bennett @Noah Martin and any other interested mentors) should get together again and talk about what you and I have found out. This should probably happen in the next week if we think this is a good idea (and if @Enrique Chee can accommodate such a meeting).

2017-12-30
Dana Batali
Dana Batali4:09 PM

@Declan Freeman-Gleason: i agree that building consensus on this decision is important. For my part, I'm pretty convinced this is the way to go. I just want to encourage the student programming leaders to consider how this change of codebase suggests a change to how we organize and schedule programming deliverables. I discussed this with Binnur before the holidays and she seemed to be on the same page pending your holiday results (which appear to be resounding successes).

Dana Batali
Dana Batali4:12 PM

@Declan Freeman-Gleason: i notice that you've been submitting changes to our fork of the 254 codebase. I think that's the right thing to do. But I would highly suggest that you make modifications to the base readme to describe the changes and potential motivations. Since this is a public repo - we don't want visitors to think it's a simple clone of their code. And we certainly do want to give them credit... Perhaps a leading paragraph is all that's needed. Which leads to the next question: if we decide to adopt this codebase, would we fork it into the repo named 2018-Powerup?

Declan Freeman-Gleason
Declan Freeman-Gleason4:58 PM

@Dana Batali I think we would probably put it into 2018-POWERUP... We seem to be following that convention; is your question more about if we should actually fork or just copy?

2017-12-31
Dana Batali
Dana Batali11:52 AM

@Declan Freeman-Gleason: my concern is simply that we preserve the revision history that came with their code base as we evolve it season to season.

Declan Freeman-Gleason
Declan Freeman-Gleason12:53 PM

I'll add the fork as a remote and `pull`.

Dana Batali
Dana Batali2:39 PM

Perfecto.... BTW I updated the readme

2018-01-01
Binnur Alkazily
Binnur Alkazily12:05 PM

Catching up with the thread. @Declan Freeman-Gleason excellent progress! Love the demo videos!!

2018-01-02
Dana Batali
Dana Batali9:24 AM

fyi: i renamed testCam.py to testUSBCam.py

Paul Vibrans
Paul Vibrans9:34 AM

Could Declan's file be presented in a flow chart for visual learners like me?

Darwin Clark
Darwin Clark9:47 AM

@Dana Batali this would be at a meeting, yes?

Dana Batali
Dana Batali9:53 AM

@Paul Vibrans - this would be a great exercise and the results might even be valuable. Downside is constructing a flow chart would take many (many) hours. Basic idea: understanding the code is more like understanding hyperlinks within a complex web document than the flow of logic for a single, particular, action. A series of flow-charts would probably be required to convey all the hyper-ness of the software. :slightlysmilingface:

Dana Batali
Dana Batali9:57 AM

@Darwin Clark - it's likely that organized/sanctioned meetings are the only time/place to get more than a couple of people together. It's also probable that with kickoff imminent, the highest priority is for students to focus on the game and strategy. If there is sufficient interest (pending a decision to actually commit to 254's codebase - which may be imminent), we might want to try to schedule a separate 254 codebase study/brainstorming session. I'll be happy to help with that if there's sufficient interest.

Paul Vibrans
Paul Vibrans11:21 AM

I was thinking the complexity of the problem might require something drawn in 3-D CAD that would be viewed best on a computer screen.

Darwin Clark
Darwin Clark11:37 AM

@Dana Batali Okay, keep me posted about those meetings.

2018-01-07
Chris Rininger
Chris Rininger10:12 AM

Curious about vision: Anyone thinking cube detection to speed up the “grab cube” task in both auto and tele? Bright yellow makes me think game designers had it in mind.

Dana Batali
Dana Batali10:18 AM

that plus an autodrive capability to place the cube into the small exchange portal were discussed

Paul Vibrans
Paul Vibrans11:12 AM

There should be a diagram of this that pops up immediately on the drivers station

2018-01-08
Grant
Grant7:45 PM

@Grant has joined the channel

Grant
Grant7:46 PM

@Grant has left the channel

2018-01-09
Darwin Clark
Darwin Clark4:47 PM

@Darwin Clark pinned a message to this channel.

2018-01-10
Enrique Chee
Enrique Chee10:57 PM

Back to limelight ? http://www.wcproducts.net/WCP-0150

Chris Rininger
Chris Rininger11:15 PM

Even if we end up doing rPI or Jetson or whatever, is the limelight something that could be used as a tool for less advanced programmers to learn initial aspects of vision as a stepping stone to the less canned solutions? And is it an insurance policy in case we run into blockers with our target solution?

2018-01-11
Dana Batali
Dana Batali12:04 AM

My opinion is that we don't need limelight. The hard part is integrating vision into the control system and limelight doesn't solve this problem... @Darwin Clark mentioned he already has a pipeline going on his jetson.. it's possible we'd need to port it to the raspberry pi since it offers faster and cheaper camera options. I don't see advantages in training, etc given our current level of commitment

Darwin Clark
Darwin Clark12:34 PM

Dana - I've been starting to work on cube tracking at home, with a cube that Chee let me borrow. I personally think that vision is something that is worth it for me because the time spent learning the 254 system would effectively be time lost that could have been spent working on a vision pipeline. Vision has its place in auto, as well as teleop. Thoughts?

Declan Freeman-Gleason
Declan Freeman-Gleason12:55 PM

If we go with shooting cubes, tracking the scale position could be more important... The Ri3D videos make intaking/the exchange easier than we expected.

Declan Freeman-Gleason
Declan Freeman-Gleason12:56 PM

I think the hardest part is integration. You should consider that when allocating your time.

Darwin Clark
Darwin Clark1:01 PM

That was the biggest challenge. I skipped around the "are we using Jetson" and "how to integrate" questions because time seemed to be spent working on the 254 porting. I just began work on something that I KNOW how to do.

Dana Batali
Dana Batali1:01 PM

@Darwin Clark - i wholeheartedly agree that vision is a valuable capability for our team - and currently you are the only programmer who we've identified to be dedicated to this. Certainly targeting yellow cubes is the likely suspect and being able to approach them quickly and reliably has value for both auto and tele. The question of whether limelight helps us or hinders us is reasonable to ask precisely because vision appears quite relevant in this game. Since we already achieved a basic level of success identifying targets, I believe limelight doesn't bring incremental value to our team at this point. Speed, too, is crucial and this is why I continue to emphasize fast framerates (target: 100fps)... Combining a simple vision pipeline with rapid target delivery to the robot control is the way to maximize success, i reckon. For this reason, I believe you'll need a modicum of awareness of the 254 control system...

Finally, and most importantly: the equipment requirement for vision is fairly high: we need regular access to a drivetrain as soon as possible. I expect we could share it with the drivetrain owner(s), but we'll need to try to avoid contention with other manipulator programmers. So, minimally, if we want vision, we need a drivetrain+roborio pretty continuously during build season or at least 'til we achieve success. Manip subsystems will also need access to a testbed (roborio, solenoids, motor controllers) very soon too.

Dana Batali
Dana Batali1:03 PM

And I agree with @Declan Freeman-Gleason that getting something into the exchange looked reasonably straightforward - but there's always the driver's visibility concerns. Another aspect of vision is to ensure that driver cams are useful (lens choices, framerate maximization, latency minimization)..

2018-01-12
Ronan Bennett
Ronan Bennett11:06 PM

@Adam Rideout Redeker @Charlie Standridge @Darwin Clark @Emma Lahtinen If you were NOT at the meeting today, it would be helpful if you could do the following before the next meeting (it will not take long at all)
1.) Go to Github and make sure you’re signed into your Github account
2.) In the upper right click on Your Profile, then Repositories, then 2018-POWERUP, then Settings
3.) Scroll all the way down and delete the repository
4.) Go to https://github.com/Spartronics4915
5.) Click on 2018-POWERUP. It should say Private next to it. (If you don’t see this repo, then you are not part of the Spartronics4915 organization - please send me your github name in slack and we will add you)
6.) Press fork in the upper right and confirm

Emma Lahtinen
Emma Lahtinen11:09 PM

@Emma Lahtinen has left the channel

2018-01-13
Dana Batali
Dana Batali11:31 AM

@Declan Freeman-Gleason: looking at the robot initialization sequence at the moment... I see a few things that seem a little off to me. Also I'll include a few code observations here:

1. the Drive has a static initialization that may happen sooner than we'd like. We should search and destroy the coding pattern for "singletons" where the static instance variable is constructed as a side effect. Instead, I believe we should employ "lazy construction". In Drive.java, here's what I mean:


private static Drive mInstance = null;

private static final int kLowGearPositionControlSlot = 0;
private static final int kHighGearVelocityControlSlot = 1;

public static Drive getInstance()
{
    if (mInstance == null)
    {
        mInstance = new Drive(); // deferred until first use, after robot init
    }
    return mInstance;
}


Other observations:

we should never call super.method() unless we've overridden the method. Instead we should invoke the method via method() or this.method().

WPI_TalonSRX implements setInverted and routes it to super.setInverted(). Its parent class, TalonSRX, doesn't implement setInverted(), but the parent of TalonSRX, BaseMotorController, does, where it routes to the JNI SetInverted call.

* the infinite loop we saw yesterday had the smell of something happening during construction right?
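A small illustration of the dispatch difference behind the super.method() guideline (hypothetical classes standing in for the WPI_TalonSRX hierarchy — not the real CTRE code): a plain method() call dispatches through any override, while super.method() silently bypasses it.

```java
// Illustrative stand-ins for BaseMotorController / TalonSRX / WPI_TalonSRX.
class BaseController {
    protected boolean inverted = true;

    public void setInverted(boolean invert) {
        inverted = invert;              // the "real" work happens here
    }
}

class MidController extends BaseController {
    // does NOT override setInverted(), like TalonSRX above
}

class WrappedController extends MidController {
    public int calls = 0;

    @Override
    public void setInverted(boolean invert) {
        calls++;                        // extra bookkeeping in the override
        super.setInverted(invert);      // legitimate: we DID override it
    }

    public void configureDefaults() {
        // Guideline: plain setInverted() dispatches through our override;
        // writing super.setInverted(false) here would skip the bookkeeping.
        setInverted(false);
    }
}
```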

Dana Batali
Dana Batali12:40 PM

other offenders of lazy initialization guideline:
RobotState
RobotStateEstimator
VisionProcessor
PathAdaptor:CompBot(), PracticeField()
DriveSignal
JSONParser
BNO055
MotionState

etc... Obviously some are more important than others. The concern associated with static construction is that there are no (or only very obscure) guarantees on execution order. In the context of CAN devices, we wouldn't want to perform CAN operations (say CANTalon) until/unless the robot's CAN initialization code has already run.
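A minimal sketch of the difference (illustrative class names, not the actual robot code): the eager singleton constructs itself as a side effect of class initialization, which fires whenever the class is first touched, while the lazy one defers construction to the first getInstance() call — a point we control, e.g. after CAN/HAL setup.

```java
// Illustrative only: shows WHEN construction happens in each pattern.
class HardwareLog {
    static final StringBuilder events = new StringBuilder();
}

class EagerSubsystem {
    // Runs during class initialization -- possibly before robot init:
    static final EagerSubsystem instance = new EagerSubsystem();
    private EagerSubsystem() { HardwareLog.events.append("eager;"); }
}

class LazySubsystem {
    private static LazySubsystem instance = null;
    private LazySubsystem() { HardwareLog.events.append("lazy;"); }

    // Deferred until the first explicit call, which we can place after init:
    static LazySubsystem getInstance() {
        if (instance == null) instance = new LazySubsystem();
        return instance;
    }
}
```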

Dana Batali
Dana Batali12:53 PM

looking at Robot.java:

Robot constructor has a log message that should result in bytes written to file. Member variable initialization likely precedes the call to `CrashTracker.logRobotConstruction()`, so if we never see this file:

1. there are issues with creating and writing to the file
2. we get an exception during Robot construction.

#2 seems more likely than #1. To narrow things down, we might want to consider migrating all the member initializations into the constructor and peppering some print statements there.
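A quick demonstration of why #2 would hide the log line (hypothetical names, not the actual Robot/CrashTracker code): in Java, instance field initializers run before the constructor body, so an exception thrown while initializing a member means the constructor's first statement — the log call — is never reached.

```java
// Illustrative only: field initializers run before the constructor body.
class Logbook {
    static final java.util.List<String> lines = new java.util.ArrayList<>();
}

class FragileDevice {
    FragileDevice() { throw new IllegalStateException("device missing"); }
}

class RobotSketch {
    // Initialized BEFORE the constructor body executes:
    private final FragileDevice device = new FragileDevice();

    RobotSketch() {
        Logbook.lines.add("robot constructed");  // never reached if the field throws
    }
}
```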

Dana Batali
Dana Batali1:03 PM

looking at wpilib: RobotBase.java:

main():
1. load app manifest to determine class name of robot.
2. construct an instance of the robot:

System.out.println(" Robot program starting ");
RobotBase robot;
try {
    robot = (RobotBase) Class.forName(robotName).newInstance();
} catch (Throwable throwable) {
    Throwable cause = throwable.getCause();
    if (cause != null) {
        throwable = cause;
    }
    DriverStation.reportError("Unhandled exception instantiating robot " + robotName + " "
            + throwable.toString(), throwable.getStackTrace());
    DriverStation.reportWarning("Robots should not quit, but yours did!", false);
    DriverStation.reportError("Could not instantiate robot " + robotName + "!", false);
    System.exit(1);
    return;
}

Dana Batali
Dana Batali1:04 PM

so this is where those messages come from!

Dana Batali
Dana Batali1:09 PM

@Ronan Bennett: if you get a chance, can you make a sticky post with all the active members of the programming team and their associated github ids? Something like this:

@Dana Batali: dbadb
@Ronan Bennett: ronan33
@Declan Freeman-Gleason: petroglyph

Declan Freeman-Gleason
Declan Freeman-Gleason1:31 PM

Interesting...

Declan Freeman-Gleason
Declan Freeman-Gleason1:32 PM

So far, I'm having an issue where the 254 code runs for ~10 seconds and then just stops (if you run it from ssh). If you deploy the code and let it run normally, the driver station reports that it has no robot connection (even though you can ping and networktables works).

Declan Freeman-Gleason
Declan Freeman-Gleason1:33 PM

Some basic code from the wpi wizard runs fine, unless the 254 code was deployed in a certain way before the stock code was.

Declan Freeman-Gleason
Declan Freeman-Gleason1:33 PM

If the 254 code was deployed before, the driver station still can't connect.

Dana Batali
Dana Batali1:49 PM

hmm - so we're really messing things up... Seems like we need to see deeper into Robot() - by migrating the instance-var construction into the formal constructor and sprinkling in prints. Also: we want to make sure Drive isn't constructed before Robot.

Ronan Bennett
Ronan Bennett2:44 PM

Github usernames for active programmers:

@Adam Rideout Redeker adam-rr
@Austin Smith AustinTSmith
@Binnur Alkazily binnur
@Cory_Houser CoryHouser
@Dana Batali dbadb
@Darwin Clark loqoman
@Declan Freeman-Gleason pietroglyph
@Josh Goguen joshgoguen
@Justice James jamesjus000
@Mike Rosen mrosen
@Michelle Dalton mdalton-spartronics
@Mark Tarlton mtarlton
@Martin Vroom NinjaDuck9000
@Noah Martin darklink2458
@Randy Groves randomgrace
@Riyadth Al-Kazily riyadth
@Ronan Bennett ronan33
@Ryan Olney RyanOlney

Ronan Bennett
Ronan Bennett2:45 PM

@Ronan Bennett pinned a message to this channel.

Darwin Clark
Darwin Clark2:57 PM

@Ronan Bennett That is the correct handle for me.

Declan Freeman-Gleason
Declan Freeman-Gleason3:10 PM

It was another stack overflow. I caused it when I changed `super` to `this`. It's hard to find the actual stacktrace anywhere, which makes stack overflows hard to debug. My takeaway from this is that `/var/local/natinst/log/FRC_UserProgram.log` is your best friend if you can't figure out what's going on.

Declan Freeman-Gleason
Declan Freeman-Gleason3:11 PM

I have it driving, I've fixed some unit issues, but I still can't get the left wheel to turn.

Declan Freeman-Gleason
Declan Freeman-Gleason3:12 PM

More concerning is the fact that the connection occasionally drops. There's also some weirdness with the driver station, but I suspect it's just a bug with the unofficial qdriverstation that I'm using.

Declan Freeman-Gleason
Declan Freeman-Gleason3:14 PM

The connection dropping may be my computer, so I'm going to try a different wireless network adapter.

Mike Rosen
Mike Rosen3:55 PM

Hey, gang, sorry about this but I've got a conflict tomorrow, Sunday and won't be at our meeting.

Dana Batali
Dana Batali5:09 PM

@Declan Freeman-Gleason - i saw your checkin. Was it the static initialization of singletons? If so, did you identify the actual culprit?

Dana Batali
Dana Batali5:10 PM

(glad you combined/eliminated LazyCANTalon)

Declan Freeman-Gleason
Declan Freeman-Gleason6:41 PM

I reintroduced a stack overflow bug when I changed `super` to `this`.

Declan Freeman-Gleason
Declan Freeman-Gleason6:44 PM

I've fixed the left motor not turning. When we called `configFwdLimitSwitchNormallyOpen` we set the `LimitSwitchSource` to a remote connector, which caused a sticky fault. I've corrected the setting, so no more sticky faults. The silverlight page and the motor lights were key to diagnosing this.

Randy Groves
Randy Groves6:47 PM

Randy Groves == randomgrace

Dana Batali
Dana Batali7:13 PM

So I'm uncertain as to the current status. Are we "fully operational"? Are you working on templates for tomorrow?

Declan Freeman-Gleason
Declan Freeman-Gleason7:29 PM

Path planning still isn't working. I'm going to start on the templates in a little while.

Declan Freeman-Gleason
Declan Freeman-Gleason7:30 PM

I also had to port the dashboard to the 2018 networktables.

Declan Freeman-Gleason
Declan Freeman-Gleason7:30 PM

`CANTalonFactory` is working again too.

Dana Batali
Dana Batali8:00 PM

On super vs this... Any method of CANTalon that invokes a parent-class method of the same name must call super. If that's all we do in a CANTalon method, we shouldn't be overriding the parent-class method in the first place. If I have a method bar() whose implementation invokes this.bar(), we have infinite recursion.
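
A minimal sketch of the recursion hazard (class names are ours, not the real CTRE classes; a counter stands in for the JNI call):

```java
class BaseController {
    static int calls = 0;

    public void setInverted(boolean invert) {
        calls++;  // stands in for the real downstream (JNI) call
    }
}

class WrappedController extends BaseController {
    @Override
    public void setInverted(boolean invert) {
        super.setInverted(invert);  // OK: delegates up to BaseController
        // BUG variant: this.setInverted(invert) would re-enter
        // WrappedController.setInverted and recurse until StackOverflowError.
    }
}

public class RecursionDemo {
    public static void main(String[] args) {
        new WrappedController().setInverted(true);
        System.out.println(BaseController.calls);  // 1
    }
}
```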

Declan Freeman-Gleason
Declan Freeman-Gleason9:24 PM

I understand why we have infinite recursion. What was your rationale for wanting to change `super` to `this` except where we had a method of the same name in the child class and superclass? If we have super everywhere, we don't need to worry about infinite recursion (unless there's something about `this` vs. `super` that I'm missing in this regard.)

Declan Freeman-Gleason
Declan Freeman-Gleason10:58 PM

I'm starting to wonder if `CANTalon` should really extend `WPI_TalonSRX`... We're replacing, not extending the behavior of the aforementioned superclass, and having new and old methods available side-by-side creates the problem of having to choose between the new and old versions. Not to mention that calling something in the superclass can result in a state change being lost to the subclass.
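
A sketch of the composition alternative (all names here are illustrative, not a proposal for the actual API): wrap the vendor object instead of extending it, so callers can only reach the methods we expose and our bookkeeping can't be bypassed by a superclass call.

```java
// Stand-in for the vendor class (WPI_TalonSRX in the real discussion).
class VendorMotor {
    double output = 0.0;

    void set(double value) {
        output = value;
    }
}

// Composition: we hold a VendorMotor rather than extending it, so the
// vendor's other methods aren't visible to callers and can't silently
// change state behind our back.
class MotorWrapper {
    private final VendorMotor mDelegate = new VendorMotor();
    private double mLastCommand = 0.0;  // our state, always kept in sync

    public void set(double value) {
        mLastCommand = value;
        mDelegate.set(value);
    }

    public double getLastCommand() {
        return mLastCommand;
    }
}
```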

Declan Freeman-Gleason
Declan Freeman-Gleason10:59 PM

I've also gone ahead and changed everything in `Constants` to final... Constants should really be constant.

Declan Freeman-Gleason
Declan Freeman-Gleason11:08 PM

@Dana Batali I would like your opinion on removing `WPI_TalonSRX` as `CANTalon`'s superclass. (Now that I think about it more, if there are methods in the superclass that screw up the state in the subclass, then we should just override them. Then it's just up to the API consumer to decide if the old or new method is better.)

Dana Batali
Dana Batali11:24 PM

Wpi_talon implements motor safety etc, iirc

Dana Batali
Dana Batali11:31 PM

Re:super vs this... Seems like a minor miscommunication... We can discuss tomorrow. I agree that state management conflicts would be bad... I would think the safe thing for now is for CANTalon to only implement methods not implemented by the superclasses.

Dana Batali
Dana Batali11:35 PM

again, the typical use for super is for an overridden/intercepted method to invoke the parent functionality. This isn't very common and in a way breaks some of the advantages of class-hierarchy abstractions

2018-01-14
Dana Batali
Dana Batali8:57 AM

@Declan Freeman-Gleason: nice late-night work! I've synced both repos and can report that they build successfully. I wonder if we should delete the javadocs? I'll start peeking at your examples/templates

Dana Batali
Dana Batali9:04 AM

@Declan Freeman-Gleason - i notice that you didn't pull the example templates into 2018-POWERUP. Was this intentional?

Declan Freeman-Gleason
Declan Freeman-Gleason11:30 AM

Yes, it was intentional. I've added a blank subsystem for them to drop their code into in the 2018-POWERUP repo. They can then refer back to the example, which will permanently live in 254Base.

Dana Batali
Dana Batali11:32 AM

poifect!

Dana Batali
Dana Batali11:33 AM

i'm going through the sync process... Probably a good idea to make that the first order of biz for the students?

pull upstream (did we define upstream via git remote add?)
push to origin

Dana Batali
Dana Batali11:34 AM

perhaps another good exercise is to create a tag on the master at this juncture?

Declan Freeman-Gleason
Declan Freeman-Gleason12:29 PM

I can't remember if we talked about upstream or actually added it as remote... That will be something we do.

Declan Freeman-Gleason
Declan Freeman-Gleason12:29 PM

Do you mean a tag on 2018-POWERUP?

Chris Rininger
Chris Rininger7:35 PM

Potentially useful tool for planning auto (& even tele) robot routes shared: https://www.chiefdelphi.com/forums/showthread.php?threadid=161452

Enrique Chee
Enrique Chee7:48 PM

THANKS !!!

2018-01-15
Josh Goguen
Josh Goguen10:05 AM

I'm going to be about 15 minutes late. Sorry.

Riyadth Al-Kazily
Riyadth Al-Kazily4:32 PM

There are new Slack channels for the engineering design of each of the major modules: scissor lift, intake grabber, and climber. I suggest everyone on programming take a look at what is coming our way, and if you know which module you will be working on you should subscribe, monitor regularly, and ask questions. It is important to get an idea how these things are going to be controlled by software before they are done with the design phase, as you may not be able to ask for changes later.

2018-01-16
Enrique Chee
Enrique Chee10:37 AM

Thanks Chris. Units are in inches.

Dana Batali
Dana Batali11:44 AM

@Declan Freeman-Gleason @Riyadth Al-Kazily @Peter Hall: I'd like to resolve an issue that crosses boundaries between programming and electronics teams. The topic is the Pigeon IMU. Here's the sitch as i understand it (declan, please correct).

- the pigeon imu is central to all autonomous actions
- we'd like our software to work with the pigeon on a number of chassis/robots
- we only have two pigeons
- we'd like to easily move a pigeon from one chassis to another
- i understand that it can be connected directly to the CAN bus - but there's another way to piggy-back off an existing CANTalon (?).
- if we want it to be a first-class CAN entity, I understand that some special wiring may be required (even soldering!), this hasn't yet been done.
- if we want to continue to piggyback off a Talon, we may need some method for ensuring that the connector is robust across multiple moves.

thoughts?

Declan Freeman-Gleason
Declan Freeman-Gleason12:28 PM

The connector is the same one that you might plug an encoder into... The female end is on the Pigeon, and it was easy enough for me to connect (it also seemed reasonably sturdy) that I think it's the best way to go if we're moving it around a lot (especially because CAN wiring appears to be a pain.)

Dana Batali
Dana Batali1:24 PM

@Declan Freeman-Gleason - can you clarify whether we would need an extra CANTalon with this approach? Or would we simply always plug it into, say, the right follower talon?

Dana Batali
Dana Batali1:26 PM

: here's a visual map (work in progress) of the pins and devices and subsystems we're working with. I imagine @Ronan Bennett might try to keep this diagram (or a similar one) in sync over the build season.

https://docs.google.com/presentation/d/1SacpgoWj3IzrbT54iJLWIOtRL7Q4cOxLmkvhCvlwkU0/edit?usp=sharing

Note that the PCM devices are in the code but currently not functional.. We'll clearly be modifying these as the pneumatic requirements become more clear.

Peter Hall
Peter Hall2:16 PM

I will have to look at the IMU. I think it can be hooked up to a motor controller, but I am not sure.

Declan Freeman-Gleason
Declan Freeman-Gleason2:16 PM

We could easily put it on a follower Talon.

Dana Batali
Dana Batali2:24 PM

@Declan Freeman-Gleason can you make a firm policy decision regarding this with @Peter Hall? I only want to re-emphasize that a fast, robust, repeatable swap is the priority at this time. Ideally - an IMU probe function will be available for software validation. It's quite likely that we already have all this working, but looking at the current code, it reads like we've devoted an entire Talon for the sole purpose of the bridge (ie: there's a separate controller id).. This seems wasteful if it's the case.

Peter Hall
Peter Hall3:03 PM

you can hook up the Pigeon IMU to a talon just like you do an encoder. I don't know the practicality of it but physically it is possible.

Dana Batali
Dana Batali3:12 PM

do you have an opinion on whether the direct-to-can or via CANTalon is preferred? (measured as quick to change and robust) (and yes, declan indicates that he's already hooked the Pigeon to the talon without the direct-to-can).

Declan Freeman-Gleason
Declan Freeman-Gleason3:20 PM

You don't need a dedicated Talon, but the robot I was using didn't use that Talon for anything else.

Dana Batali
Dana Batali3:35 PM

yes, but was there another reason not to just add it to a follower motor? Is it true that you can't add it to the master motor if there's already an encoder there? Is the existing connector robust (as in, would you stake a match on it?)

Declan Freeman-Gleason
Declan Freeman-Gleason3:51 PM

There was no reason not to add it to a follower. I would stake a match on the connector because we already have relied upon it in matches (many times). The encoders use the same connector, which is good in regard to reliability, but also means that it cannot share a master motor with an encoder.

Dana Batali
Dana Batali4:16 PM

okay then! Next time we get a robot, we should rewire it to a follower and update the constants table.

2018-01-17
Justice James
Justice James2:03 PM

I can't make today's meeting, sorry. If someone could post a summary of what we did, that would be great.

Enrique Chee
Enrique Chee2:26 PM

Why ? Not very useful if you are not at our meetings.

Dana Batali
Dana Batali3:02 PM

Added Qs and As for those interested in the nitty gritty of robot code. Optional reading for new programmers, mandatory reading for mentors :simple_smile:. (thanks to declan for many of his answers!).

https://github.com/Spartronics4915/2018-POWERUP/blob/master/Learning.md

Randy Groves
Randy Groves5:06 PM

Get a 404 on that link.

Terry Shields
Terry Shields5:38 PM

Yeah, "page not found". Dana, can you update the learning link you listed above?

Dana Batali
Dana Batali6:33 PM

Since the file is checked in to the private repo it gives the 404 error (since, I guess, you aren't auto-logged into github or don't have access to the private repo).

Dana Batali
Dana Batali6:47 PM

and by mentors, i meant programming mentors. Terry if you are interested I can email you a .pdf, The other option is to get you set up as a github contributor.. Contact declan, ronan or noah for this.

2018-01-18
Terry Shields
Terry Shields8:50 AM

@Dana Batali: Yes, that link works perfectly! Thanks Dana

Dana Batali
Dana Batali10:28 AM

@Declan Freeman-Gleason: i untied the first knot in our startup woes... The dreaded file robotCommand can't have any comments in it because it's not a shell script. Rather, it's a file that's parsed by frcRunRobot.sh (which was trying to invoke the program with '#' and the comment text as arguments). Having gotten past that issue, I'm now merely seeing crashes.. :wink:

Dana Batali
Dana Batali11:45 AM

crashes are now resolved... Driverstation issues are now resolved on my windows laptop too. Turns out it was a firewall issue - the firewall prevents inbound connections, and those are the ones that establish the robot-connection lights. I went to the app "Windows Defender Firewall with Advanced Security", looked at Monitoring/Firewall, and found that FRC Driver Station inbound connections were disabled. Went to Inbound Rules, made the fix and voila!

Declan Freeman-Gleason
Declan Freeman-Gleason1:13 PM

Lots of great troubleshooting! I feel much better about this now. If you have time, could you test if the `WPI_TalonSRX.setInverted()` stack overflow is actually the result of a bug in CTRE's code? I would try calling it in an empty project created by the WPI project wizard.

Dana Batali
Dana Batali1:46 PM

@Declan Freeman-Gleason - here's what i'm seeing:

https://github.com/CrossTheRoadElec/Phoenix-frc-lib/blob/master/java/src/com/ctre/phoenix/motorcontrol/can/WPITalonSRX.java

implements SpeedController, but that is now an interface, not an implementation.

https://github.com/wpilibsuite/allwpilib/blob/master/wpilibj/src/main/java/edu/wpi/first/wpilibj/SpeedController.java

It also extends TalonSRX, which in turn extends BaseMotorController. Here, is where there is a defined method.

https://github.com/CrossTheRoadElec/Phoenix-frc-lib/blob/master/java/src/com/ctre/phoenix/motorcontrol/can/BaseMotorController.java

So we're now in the details of java class composition. Seems to me setInverted is implemented by the superclass (here, BaseMotorController), and it's our job either to override the superclass implementation or to implement the interface (and the two conflict).

So, I guess I need to understand more how to repro the case, even in our code base. Can you point to a line of code?

Finally - i think we should rename CANTalon (since it conflicts with the class in ctre's lib). I have a few name ideas: BHS_TalonSRX or TalonPhoenix or Talon4915. Thoughts, concerns?

Actually finally: please read the section on motor safety in Learning.md. We should discuss if the value of motor safety and SpeedController inheritance is worth the cost. Obviously we could move the inverted implementation into Talon4915...

Dana Batali
Dana Batali4:11 PM

hm. on the testbed the Talon SRX status is reported as inconsistent... Perhaps someone can try to run this down via Chief Delphi?

(and yes, when I look at the silverlight page, the firmware is the same on the four motor controllers... perhaps what's inconsistent is that only two motors are actually attached)

Dana Batali
Dana Batali4:20 PM

pop quiz: anyone know how to generate this report? and what it's a report of?

Declan Freeman-Gleason
Declan Freeman-Gleason4:23 PM

@Dana Batali There's a self-test button on the silverlight page when you select a Talon, next to the save and refresh buttons.

Dana Batali
Dana Batali4:25 PM

@Declan Freeman-Gleason: wins a github sticker. For the rest of the team: you can inspect the roborio when you are connected to its radio. Only gotcha is that it requires the microsoft silverlight plugin which has been deprecated by most browsers. AFAIK, you need internet explorer, or an older build of firefox to get to this page. Just point a compliant browser to http://10.49.15.2

Dana Batali
Dana Batali4:34 PM

Noticed that the single parameter variant of set() in WPI_TalonSRX, forces the control mode to PercentOutput. This strikes me as a bogus translation, and one that we might need to be very careful (not) to employ.

Dana Batali
Dana Batali5:08 PM

Also noticed that the Robot logs are busted in the dashboard.

2018-01-19
Dana Batali
Dana Batali8:28 AM

@Darwin Clark: ideas for vision tasks tonight:

1. identify a person in cad, perhaps @Kenneth Wiersema, to discuss vision requirements with. Perhaps a member from #strategy should join in that conversation. If there is general agreement that the highest priority vision requirement is to identify and quickly approach cubes, then the next step is to work with CAD to "reserve" space on the robot. The wooden mount that we currently have for the raspi cam might either be used directly (with the aid of an L bracket) or inspire someone on engineering to build a similar thing. We also need to consider whether a range finder would be a valuable sensor. @Peter Hall said we have 4 of the sonar sensors from the KOP. I imagine we'd mount the sonar sensors as well as illumination sources directly to the vision mount (following the example of the limelight product).

2. continue work on finding raspi settings to accentuate yellow blocks. I was imagining a combination of contrast, exposure and saturation. Did you try combining these effects last time?

3. familiarize yourself with the IP camera in the vision box and its interop with the driverstation. Using mjpeg, we can show the current latencies to members of the driving team for discussion. If we need lower latencies, we may want to explore raspi to deliver mjpg streams to the driver station. There's a thread above between declan and chrisrin on this topic.

4. continue to experiment with network tables... If we plug the raspi into the testbed network, there's likely a way to measure latency between raspi and robot code. This would be good to know.

Dana Batali
Dana Batali8:29 AM

@Declan Freeman-Gleason - i'll be looking into the setInverted problem today, fyi.

Declan Freeman-Gleason
Declan Freeman-Gleason8:33 AM

Ok, thanks!

Dana Batali
Dana Batali8:56 AM

- I see that there is a wpi update available... Probably a good idea for everyone to perform the update first thing today.

Binnur Alkazily
Binnur Alkazily9:00 AM

^^ I say it is a mandatory update - they have been working on various fixes. Let’s make sure we have the latest (and hope it won’t screw up what is working...)

Dana Batali
Dana Batali9:02 AM

@Binnur Alkazily - i've been looking for release notes, if you find them please post a link?

Dana Batali
Dana Batali9:03 AM

from chief delphi - an alternate library/interface for pathfinding:

https://github.com/JacisNonsense/Pathfinder

Dana Batali
Dana Batali9:03 AM

only interesting to the pathfinders among us (they know who they are)

Binnur Alkazily
Binnur Alkazily9:09 AM

@Dana Batali not sure where they keep release notes -- but here is what is closed in this version
https://github.com/wpilibsuite/allwpilib/milestone/14?closed=1

Binnur Alkazily
Binnur Alkazily9:11 AM

I have been watching the discussions to see if they have fixed the DS disconnect issue -- doesn't look like it. but, curious if #875 improved our experience

Darwin Clark
Darwin Clark9:16 AM

@Dana Batali Sometime today, I'll try to throw up a priority list in terms of the uses of vision, i.e., assuming everything goes correctly, what do we want to be able to do first, then if we complete that, what do we want to do next, and so on. I had not thought about the range finder as a viable sensor; that is something I would have to think through, and I imagine it is fairly low on the priority list. I'll educate myself on the latency problem and how it fits into vision. In regards to the latency question, I believe we actually did something like that at one point, but it was fairly experimental and needed further refinement.

Dana Batali
Dana Batali9:21 AM

excellent!

Some considerations:

if we can identify a yellow block, we'd like to automate picking it up. In order to do that we need to point to it, then move toward it, then activate the grabber. We might get a reasonable estimate of distance-to-block by measuring its size onscreen. If that turns out to be too inaccurate, a rangefinder (sonar, lidar, etc) may be our only solution. In other words, for this task, it might turn out to be fundamental to success (and so, perhaps, not a low priority).

regarding latency: it's merely the measure of time between when something is sensed and when that result can be acted upon. We would hope that the lag between generating a target on the raspi and it reaching the control loop of robot code would be on the order of 10-20 milliseconds (preferably less). First step is merely to measure what we have.
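
The size-on-screen idea can be sketched with the pinhole-camera relation distance = realWidth × focalLengthPx / pixelWidth. The focal length below is an assumed calibration number for illustration, and the POWER UP cube is roughly 13 inches on a side:

```java
public class DistanceEstimate {
    // Pinhole-camera relation: distance = realWidth * focalPx / widthPx.
    static double distanceInches(double realWidthIn, double focalPx, double widthPx) {
        return realWidthIn * focalPx / widthPx;
    }

    public static void main(String[] args) {
        double cubeWidthIn = 13.0;  // POWER UP cube, ~13 in wide
        double focalPx = 500.0;     // assumed camera focal length, in pixels
        // A cube spanning 100 px would be about 65 in away under these assumptions.
        System.out.println(distanceInches(cubeWidthIn, focalPx, 100.0));  // 65.0
    }
}
```

Note the estimate degrades quickly as the cube shrinks on screen, which is why a rangefinder may still be needed at distance.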

Darwin Clark
Darwin Clark9:31 AM

I feel like the example that you gave (Using a test chassis and simply following the cube) would be quite an important milestone. I'll do the latency test tonight for sure.

Declan Freeman-Gleason
Declan Freeman-Gleason10:03 AM

@Dana Batali We really should update the library inside the repository. User-directory WPILib updates shouldn't have any effect.

Dana Batali
Dana Batali10:04 AM

@Declan Freeman-Gleason - yes I'm on that task today

Terry Shields
Terry Shields10:41 AM

Posting this question in #programming and #intake-grabber ...Besides vision capability, has anyone seen other teams provide the driver with a "successful capture" indication when the robot clutches a cube?

Darwin Clark
Darwin Clark2:12 PM

@Terry Shields I have not seen something like that for this year's game, however @Dana Batali was describing a team that pulled off something like this last year with gears. The team had bound a button on their driver station that was responsible for auto-placing a gear on the prong/extrusion. The cube situation this year would be a similar system.

Binnur Alkazily
Binnur Alkazily3:31 PM

@Declan Freeman-Gleason fyi -- upcoming release may fix the disconnect issues you noticed: https://github.com/wpilibsuite/allwpilib/issues/889

Dana Batali
Dana Batali3:33 PM

@Binnur Alkazily - I've been running that release on the testbed today and still see occasional driverstation disconnect messages. Haven't noticed negative effects associated with this message, but more testing is certainly called for. There are other scenarios where we see this message and they have been tracked down to crashing robot code, etc. So... two steps forward, one step back, ...

Binnur Alkazily
Binnur Alkazily3:35 PM

I don't think 889 fix made into 2018.2.1 release -- need to look for 2018.2.2 update

Binnur Alkazily
Binnur Alkazily3:36 PM

scratch that, it will be 2018.3.1 (not 2.2)

2018-01-20
Binnur Alkazily
Binnur Alkazily12:30 AM

speaking of the CTRE...
>We have also decided to compile/release a version of last season's CTRE v4 Toolsuite that is compatible with the 2018 roboRIO image, due to the response of teams who are not comfortable porting/updating their various software components to the newer and back-breaking Phoenix v5.
https://www.chiefdelphi.com/forums/showthread.php?t=161407

Chris Rininger
Chris Rininger12:07 PM

Was just chatting with Kenneth in Engineering channel, and he mentioned the camera for vision will not be on the lift. I believe to enable the drive team to aim cube placement on the scale and achieve greater precision (& placement batting average) than a low shooter, there will need to be a camera on the lift. I'm raising it now for awareness - it seems very likely we'll need two cameras for these divergent objectives.

Dana Batali
Dana Batali12:18 PM

an alternate solution you proposed was to have a birdseye view cam on the driver station. Has this idea been rejected?

Dana Batali
Dana Batali1:49 PM

voltage ramp rate for CANTalon from CD:

>We had a similar problem to this, but if you held down the joystick for long enough, it would accelerate.
>We were doing configOpenloopRamp(60, 0) on each of our motors, and the 60 used to represent the amount of volts the motors would ramp up each second. But it was changed to be the amount of seconds the motor takes to go from 0 volts to max, so our motors were accelerating very slowly.
>Changing the 60 to 0.5 solved our problem
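
The unit change can be sanity-checked with a little arithmetic (12 V nominal battery assumed): under the old volts-per-second meaning, 60 reached full output in 0.2 s; under the new seconds-to-full meaning, the same 60 takes a full minute.

```java
public class RampUnits {
    // Old semantics: value was volts-per-second, so time-to-full = 12 V / value.
    static double oldSecondsToFull(double voltsPerSecond) {
        return 12.0 / voltsPerSecond;
    }

    // New semantics: the value *is* the seconds from neutral to full output.
    static double newSecondsToFull(double seconds) {
        return seconds;
    }

    public static void main(String[] args) {
        System.out.println(oldSecondsToFull(60.0));  // 0.2  -- snappy
        System.out.println(newSecondsToFull(60.0));  // 60.0 -- the sluggish bug
        System.out.println(newSecondsToFull(0.5));   // 0.5  -- the fix
    }
}
```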

Dana Batali
Dana Batali1:54 PM

CANTalon firmware 3.3, are we currently at 3.1?

http://www.ctr-electronics.com/talon-srx.html#producttabstec

Dana Batali
Dana Batali3:02 PM

sadly we've already broken a few backs, so I'm guessing we'll proceed happily into the future :slightly_smiling_face:

Dana Batali
Dana Batali3:28 PM

perhaps we can get @Brian Hutchison to dabble with this on the testbed?

Declan Freeman-Gleason
Declan Freeman-Gleason3:49 PM

Yes, there was a recent update that re-added motion profiling. We should probably go ahead and update the Talons.

Dana Batali
Dana Batali4:17 PM

@Declan Freeman-Gleason: i'm looking at config for Talon4915... Is it possible you are too?

Dana Batali
Dana Batali4:40 PM

- yippee! i see that wpilib 2018.2.2 is now available. Everyone please update. (and yes I'll check in changes to git too)
(now that I look at it, it appears trivial - until we need access to gameSpecificMessage).. So the fix for the robot connection bug isn't there.

Binnur Alkazily
Binnur Alkazily4:42 PM

Ha! One of these days I will figure out their versioning model :) thanks for the heads up!

Binnur Alkazily
Binnur Alkazily4:47 PM

Yea! And it wasn’t a suggestion that we would change... was chatting w/ Declan on how they annoyed the community w/ all work and no value, so it was more of sharing how they tried to fix it.... poorly

Chris Rininger
Chris Rininger7:39 PM

Good thought, but that was outlawed in 2016 when many teams decided do it to mitigate poor Stronghold game sight lines - max height of driver station has been 6'6" off the carpet since then. How about a quasi-birdseye view on the robot? Pieces: motor-driven fatmax tape measure connected to telescoping camera pole that is lower than 55" at shortest. Possible pole: https://www.amazon.com/Koolertron-Lightweight-Professional-Shotgun-Microphones/dp/B01MF9B055/ref=sr13?ie=UTF8&qid=1516505309&sr=8-3&keywords=telescoping%2Bcamera%2Bpole&th=1 Raise & lower as needed. Or we could put a camera up on the lift :slightlysmilingface: and use coil usb cable for power/data: https://www.amazon.com/Coiled-Connector-Synchronize-Charging-Extension/dp/B0741PJYZM/ref=sr12?ie=UTF8&qid=1516505920&sr=8-2&keywords=usb+coil

Chris Rininger
Chris Rininger9:16 PM

Helpful pictures: driver station perspective: https://www.chiefdelphi.com/media/photos/45675

Chris Rininger
Chris Rininger10:35 PM

Helpful screen shot from Synthesis simulator: https://1drv.ms/i/s!AkoZXdKPoojXjqN6UuFUPzRoafjQ. My observation is a camera on the robot helps a lot, but it needs to be quite high, looking down a bit, in order to avoid cubes in front obscuring cubes behind. If we could find a way to have a really high camera, it could help us outperform other high placers - seems clear to me.

2018-01-21
Binnur Alkazily
Binnur Alkazily2:11 PM

Quick summary and next steps for the driver camera discussion -- electronics team will wire an IP camera (with a switch) to the robot EOD. We'll need to rig up some temporary mount for the camera on the robot (no dangly cameras allowed this year!!). This should give us the 1st step towards driving the robot with a camera.
We also discussed using raspberry pi connected camera as next step. However, wiring, etc. for that would be longer. TBD
@Declan Freeman-Gleason @Dana Batali @Darwin Clark @Chris Rininger ^^^

Chris Rininger
Chris Rininger6:35 PM

Thanks, just to confirm is this the lower camera? We also talked about a camera on the lift (higher than the lift actually, at the back so it can look down on the scale) for aiming cube placement - latency less critical for this camera - can be USB & probably should be since there are coiled USB cables that would carry both power and data.

Binnur Alkazily
Binnur Alkazily7:03 PM

It is a camera mounted to the current robot chassis to help evaluate camera performance relating to driving needs. It is not tagged for any specific purpose beyond that. Any learnings from this will need to identify the 'next steps' for requirements relating to camera needs. Ex. assuming one actually had a scissor lift, one could assume to also mount this camera in a position to evaluate usefulness of cube placements...

Binnur Alkazily
Binnur Alkazily8:34 PM

ha! Dana & Declan -- checkout https://github.com/CrossTheRoadElec/Phoenix-frc-lib/issues/22
it explains our strange behavior w/ follower not being inverted as its master...

2018-01-22
Dana Batali
Dana Batali8:48 AM

wow - this sounds like it! great detective work! Amidst all the chaos yesterday, I'm not exactly certain what got tested. I know we discussed this possibility, but I'm not sure if @Declan Freeman-Gleason deployed the test or what the results were. On your other points, i was relatively certain that the nominal output was set to 0 for manual drive. I'm not sure what the brake situation is/was.. Declan: as we briefly touched upon, I'll be making some additional cleanups to Talon4915 to offer support to subsystems to configure/init params without editing generalConfig. We should only modify that if we truly want to establish changes across all motors for all subsystems.

Dana Batali
Dana Batali10:48 AM

looking at the code here:
https://github.com/dbadb/2018-POWERUP/blob/master/src/com/spartronics4915/lib/util/drivers/CANTalonFactory.java, line 32.

I see that we were applying the invert to the slave motor. My reading of Issue#22 is that this is now correct (and may not have been earlier). It's possible we should play with both settings at our earliest convenience.

Dana Batali
Dana Batali10:49 AM

ie: it's possible the new CTRE update changed this behavior since the right motors were certainly acting "wonky" iirc.

Chris Rininger
Chris Rininger12:31 PM

@Declan Freeman-Gleason FYI Jon said he talked with this year's primary driver (to be announced soon) & this is the priority order for enabling driving controls: 1) single joystick - arcade drive, 2) Xbox controller - split axis, 3) I'd like to also enable the two joystick split axis as a third option. By split axis I mean left stick = forward/backward and the right stick left/right. If it's easy, go ahead and also enable the alternative xbox setup we had with Helios (with forward/backward = triggers). If the number of options is limited though, please enable them in the priority order I just stated. Thanks!

Dana Batali
Dana Batali2:41 PM

@Binnur Alkazily @Declan Freeman-Gleason :
I was inspired by something Binnur said yesterday: we should factor the general cantalon code of the drivetrain into a new reusable class.

I copied the CANTalon code from Drive.java into a new class CANTalon4915Drive.java. I then created a new variation of Drive.java (called Drive2.java) with an instance of the new class and many fewer lines of code.

My sense is that this exercise was worth the trouble and I'd advocate for migrating to this approach.

Here's what it looks like.

https://github.com/dbadb/2018-POWERUP/commit/7eb522386c60b58156e401ce7f48f42de9668b3d

Comments? Thoughts? Soul crushing critiques?

Dana Batali
Dana Batali2:46 PM

(and btw: i guessed which motor the imu was hanging off since that change only resides in declan's branch)

Dana Batali
Dana Batali2:51 PM

and related question: did we upgrade the Talons to firmware 3.3?

Darwin Clark
Darwin Clark4:34 PM

@Chris Rininger @Dana Batali I have a few questions about the priorities for the driving camera. I understand that the camera we showed Chris on the testbed was low latency (perhaps enough for Chris' standards?) and I'm wondering where to proceed from here. If I understood what I heard, there was also a latency problem last year, where the latency at the shop was fine but was much, much worse at the competition. I can start there, if that is what you think is necessary. I can also look at using the video feed from the PI instead of an IP camera, if that is a priority. What do you guys think?

Dana Batali
Dana Batali4:42 PM

@Darwin Clark: thanks for checking! From my point of view regarding data from this year:

1. i understood from @Chris Rininger that last year's robot camera seemed very laggy during driving practice. Much more so than what he saw on our driver station last night.

2. if the new camera is mounted on the robot AND the drivetrain is working (we're in the middle of CANTalon headaches), we could simply wait for feedback from the driving team.

3. in the meantime, I thought it would be useful to compare the "bad" experience camera with the "good" experience camera, if only to help with prioritization. If we could reproduce bad performance with last year's camera, then we should simply wait for driver feedback with the shiny new camera. If we saw good performance with last year's camera on the testbed, then we'd be in more mysterious conditions. And perhaps that would suggest further 'research'.

Since we don't have the #3 datapoint, I imagine it's most efficient to wait for #2.

Binnur Alkazily
Binnur Alkazily4:59 PM

@Dana Batali thanks for the dive into how we can further organize the cantalon for drivetrain. will check it out later.
I don't know when I'll be in on Friday -- but sent thoughts on config changes to both you and declan on private slack message. I think these (ramp rate & nominal output) should be tried first to see how the robot behavior changes. I feel they should positively improve driving.

Binnur Alkazily
Binnur Alkazily5:02 PM

wondering about testing w/ the PI as well -- especially given driving team wants two cameras to help w/ placement. @Dana Batali thoughts?

Dana Batali
Dana Batali6:04 PM

@Binnur Alkazily: if what you mean is investigate improvement to frame rate & latency, then certainly raspi offers some avenues for improvement (more knobs, more investigations). On the other hand, if the ip cam is "good enough", then it might be more robust to go with that approach. The one that was demoed last night was $20 (so cheaper than a raspi) and likely to be more robust under power failures (eg: their file system doesn't risk corruption?). If we can avoid doing the work to make the raspi serve as a webcam, we should - that's probably some hours of unixy work. Waiting for initial reactions to the current ip cam seems like the most efficient path, but again we can switch on a dime if need be.

Binnur Alkazily
Binnur Alkazily6:09 PM

My thought was that if we need to mount and stream two cameras, what works for 1 IP camera may not work -- however, easiest thing to do is to mount 2 cameras and see the quality

Dana Batali
Dana Batali8:15 PM

fyi: we have the tech to stream any number of ip cameras or raspis, limited only by the ethernet switch. Tech-wise, this amounts to "pulling" data from the camera that the drivers wish to see. If they need to see more than one at a time, then we'll likely run into bandwidth issues. Of course, with each new onboard device the chance of a failure goes up (mean time between failures goes down).

Dana Batali
Dana Batali8:16 PM

In other words: the dashboard is configured with a number of urls and can trigger or be triggered to switch from one cam to another. This is how last year's dual-cam solution worked.

Chris Rininger
Chris Rininger8:19 PM

Thanks for the insights. It seems like we may soon be able to have an initial cross-team conversation about driver station dashboard & operator controls.

2018-01-23
Dana Batali
Dana Batali7:51 AM

release notes from phoenix 5.2.1 (jan 11, 2018)


Phoenix Framework 5.2.1 Installer (Jan 11 2018):
Class Libraries (FRC C++ 2018_v12 / FRC Java 2018_v15 / FRC LabVIEW 2018_v13 / FRC CCI 2018_v12): Individual Trajectory Point timeDur options changed. 0ms is now an option. 15ms is no longer an option.
Talon SRX/Victor SPX Firmware (3.3): Changed to accommodate new Trajectory Point timeDur options.
Phoenix Framework 5.2.0 Installer (Jan 10 2018):
Class Libraries (FRC C++ 2018_v11 / FRC Java 2018_v14 / FRC LabVIEW 2018_v12): Motion Profiling support added. Added getClosedLoopTarget.
Class Library (FRC CCI 2018_v12): Added motion profiling support.
Class Library (FRC CCI 2018_v12): Fixed issue when setting deadband where value was doubled.
Class Library (FRC Java 2018_v14): Fixed issue where getStickyFaults was returning normal Faults.
Class Library (FRC LabVIEW 2018_v12): Fixed issue where getGeneralStatus caused code crash.
Talon SRX/Victor SPX Firmware (3.2): Motion Profiling now usable.

Declan Freeman-Gleason
Declan Freeman-Gleason9:03 AM

@Dana Batali If you look at my fork (you have read access because forks inherit permissions from upstream) all my changes are checked in. One of the last things we did was to correctly set the follower, which cleared up a lot of weirdness. Brake mode is also enabled. There's still a feeling of lag when driving the robot which I can't account for.

Dana Batali
Dana Batali10:08 AM

@Declan Freeman-Gleason - just got the robot and determined that we had motor ids wrong (?). Do these correct values match your branch?

public static final int kLeftDriveMasterId = 1;
public static final int kLeftDriveSlaveId = 2;
public static final int kRightDriveMasterId = 3;
public static final int kRightDriveSlaveId = 4;

Declan Freeman-Gleason
Declan Freeman-Gleason10:12 AM

I think they do. My latest code is in my fork, so you should check if my memory is correct.

Declan Freeman-Gleason
Declan Freeman-Gleason10:16 AM

Oops, I forgot to push to my fork. I'll do that when I get home.

Dana Batali
Dana Batali10:17 AM

k - i'll keep you posted, no worries. Best of luck on finals!

Declan Freeman-Gleason
Declan Freeman-Gleason4:00 PM

@Dana Batali My changes have been pushed to my branch. https://github.com/pietroglyph/2018-POWERUP/

Dana Batali
Dana Batali4:56 PM

@Declan Freeman-Gleason - only if you are bored studying for finals :wink:

I just pushed some functional code to my repo. I've been using the robot test mode to ensure that things are behaving correctly. Currently i'm seeing some discrepancies between left and right side of the drivetrain (in current consumption). Seems like one side might have more friction, but I'll defer to engineering on that (including @Enrique Chee).

I haven't yet integrated your changes but will endeavor to do so, perhaps manually, since so much else has changed.

Enrique Chee
Enrique Chee5:02 PM

It is possible the students did not install the drivetrain correctly. The gearboxes were supposed to be tested before they finished attaching them to the belts and pulleys. @Will Hobbs @Robert Galvin @Harper Nalley

Enrique Chee
Enrique Chee5:05 PM

Someone in programming left a tape measure and a USB dongle (brand is EDIMAX) in Ms. Brown's room on Sun. Please see me if it belongs to you.

Enrique Chee
Enrique Chee5:05 PM

Someone

Justice James
Justice James5:05 PM

@Enrique Chee I think that's @Declan Freeman-Gleason 's

Declan Freeman-Gleason
Declan Freeman-Gleason5:23 PM

@Enrique Chee The USB dongle belongs to me. I missed those when I swept the room on Sunday. I will be more systematic in the future.

2018-01-24
Dana Batali
Dana Batali8:43 AM

@Declan Freeman-Gleason - i believe i pulled and integrated your changes, then pushed these to my repo. In theory you should be able to pull with no conflicts into your repo. I'll be doing more work today, so pull at your leisure.

Binnur Alkazily
Binnur Alkazily9:24 PM

@Declan Freeman-Gleason @Ronan Bennett @Noah Martin Cruz posted the scissor lift design on engineering channel — it is pinned there. Pls keep discussion to its thread. Thanks and looking forward to seeing our scissor state diagram on friday :slightlysmilingface:

Chris Rininger
Chris Rininger10:51 PM

I was thinking about lining up to climb & a possible sensor / programming opportunity. Our current intention is to have two arms extend out the front of the robot to brace against the sides of the 17" wide tower. The arms will need to be a bit wider than 17" in order for the driver to be able to get into the right place for them to straddle the tower. Would it be possible for the driver to get the robot close and then run an auto sequence that utilized one or more sensors to detect the tower edges and line up automatically? I know the Dragons (FRC 1595) do this kind of thing - talked to them last year - they automated both gear pickup and gear drop off I believe, implemented as operator invoked routines.

2018-01-25
Binnur Alkazily
Binnur Alkazily7:42 AM

Chris - anything is possible given time. We have some challenges such as ease of positioning ourselves over the ramp. I suggest starting w/ using teleop to see what is needed for that alignment before diving into specific details on sensor positions.

2018-01-26
Josh Goguen
Josh Goguen7:09 PM

:scissors:Scissor Squad:scissors:

So these are the general ideas I got from our discussion. Our scissor lift will have four known heights that will be wanted states: Low, Switch, Scale, and Climb. These wanted states will be known potentiometer values. Ex: Low=10, Scale=30, etc.. While the robot is in these states, they are also system states, but otherwise, it must go through other system states to achieve these wanted states. These system states could be: Filling, Venting, Holding, etc.. Our robot will be able to handle these system states in the code to achieve the wanted states efficiently. We also still need to account for the Jog wanted state, which I do not understand well enough yet to talk about. So overall, that is what I got from our discussion, please correct me on anything that is incorrect, and of course everything is still up to discussion. Thanks.

:scissors:Scissor Squad Out:scissors: - Josh, Corey
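A sketch of how those wanted states might map to potentiometer setpoints. Everything here is illustrative: the class name, the setpoint numbers, and the system-state list are placeholders taken from Josh's description, not the team's actual code or measured values.

```java
// Hypothetical sketch of the scissor lift states described above.
// Setpoint numbers (10, 20, 30, 40) are placeholders, not measured values.
public class ScissorLiftStates {
    // Wanted states correspond to known potentiometer values.
    enum WantedState {
        LOW(10), SWITCH(20), SCALE(30), CLIMB(40);

        final int potentiometerSetpoint;
        WantedState(int setpoint) { this.potentiometerSetpoint = setpoint; }
    }

    // System states the lift passes through while reaching a wanted state.
    enum SystemState { FILLING, VENTING, HOLDING }

    public static void main(String[] args) {
        for (WantedState s : WantedState.values())
            System.out.println(s + " -> " + s.potentiometerSetpoint);
    }
}
```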

Binnur Alkazily
Binnur Alkazily8:21 PM

@Martin Vroom @Justice James

Ryan Olney
Ryan Olney8:24 PM

Binnur Alkazily
Binnur Alkazily9:26 PM

@Ryan Olney @Austin Smith @Martin Vroom @Justice James please fill in the details of your subsystem similar to @Josh Goguen’s explanation above

Binnur Alkazily
Binnur Alkazily9:37 PM

team - also please pay attention to engineering channel — all the latest greatest is shared over there in terms of the robot. fyi

Ryan Olney
Ryan Olney9:55 PM

Current Harvester states and functions:
So far we would like to have OPEN and CLOSED wanted states, where the harvesting arms are either out preparing for harvesting or shut to make more room, respectively, both with no motors running. We would also have a wanted state with the motors running forward while the harvesting arms close to take in cubes; one with the same scenario but the motors running backwards to spit out cubes; and one to just hold onto cubes once they have been harvested, with no motors on. Finally, we decided there would be a forbidden state, called EMERGENCY, where the harvesting arms are closed in and the motors are running. We should never be in this state, but if we are, we will transfer to the CLOSED state. This is pretty much what we discussed during the meeting today, but feel free to add anything I might be forgetting, or got wrong/misunderstood.
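Ryan's states could be sketched as an enum with the forbidden-state fallback he describes. All names here are hypothetical, chosen to match the message above, not the team's actual subsystem code:

```java
// Hypothetical sketch of the harvester states described above.
public class HarvesterStates {
    enum SystemState { OPEN, CLOSED, HARVESTING, EJECTING, HOLDING, EMERGENCY }

    // EMERGENCY (arms closed in with motors running) is forbidden:
    // if we ever observe it, fall back to CLOSED.
    static SystemState next(SystemState current) {
        if (current == SystemState.EMERGENCY)
            return SystemState.CLOSED;
        return current;
    }

    public static void main(String[] args) {
        System.out.println(next(SystemState.EMERGENCY)); // falls back to CLOSED
    }
}
```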

2018-01-28
Austin Smith
Austin Smith11:14 AM

@Declan Freeman-Gleason @Ryan Olney Just wanted to let you know that I will have to leave at 3:15 today and will not be able to return. I have a recital that has been on the books for about 6 months, and I can guarantee that I will never have to leave early from a meeting for anything to do with singing or recitals again.

Binnur Alkazily
Binnur Alkazily12:51 PM

Team - I am running late (needed to wrap up work stuff...) should be in by 1:30.

Ted Larson Freeman
Ted Larson Freeman1:41 PM

@Ted Larson Freeman has joined the channel

Riyadth Al-Kazily
Riyadth Al-Kazily2:34 PM

Binnur Alkazily
Binnur Alkazily6:42 PM

@Declan Freeman-Gleason @Dana Batali slow-mo autonomous 90-degree turn

Dana Batali
Dana Batali8:14 PM

:fireworks: :rocket: :+1: :poultry_leg:

2018-01-30
Declan Freeman-Gleason
Declan Freeman-Gleason12:05 PM

Good resource for understanding how to use PID: http://www.wescottdesign.com/articles/pid/pidWithoutAPhd.pdf
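For anyone who wants to experiment alongside the paper, here's a minimal positional PID loop. This is a generic illustration, not the team's drive code; the gains and timestep are arbitrary placeholders, not tuned values.

```java
// Minimal positional PID controller sketch (illustrative only).
public class SimplePid {
    final double kP, kI, kD;
    double integral = 0.0, lastError = 0.0;

    SimplePid(double kP, double kI, double kD) {
        this.kP = kP; this.kI = kI; this.kD = kD;
    }

    // Returns the control output for one timestep of length dt seconds.
    double update(double setpoint, double measurement, double dt) {
        double error = setpoint - measurement;
        integral += error * dt;                      // I term accumulates error over time
        double derivative = (error - lastError) / dt; // D term reacts to error rate
        lastError = error;
        return kP * error + kI * integral + kD * derivative;
    }

    public static void main(String[] args) {
        SimplePid pid = new SimplePid(0.5, 0.0, 0.1); // placeholder gains
        double output = pid.update(1.0, 0.0, 0.02);
        System.out.println(output);
    }
}
```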

2018-02-01
Darwin Clark
Darwin Clark8:22 AM

@Dana Batali I will have your Jetson at the meeting to trade.

Dana Batali
Dana Batali9:41 AM

Ye programmers of robots, here is a link to Team 254's robotPeriodic loop. I forward this for you all to scan; it gives a good idea of the way they decided to ensure that their robot-with-many-pieces acts in a well-defined manner:

https://gist.github.com/dbadb/07c281956c1802e0505d841b0782637c

Intake/Harvester folks: please spelunk for the prioritization of states related to exhaust. Note that the wanted states for their intake are similar to ours.
Everyone should take note of how driver buttons control the behavior (but in a nested fashion). For example, if we're in aim mode, none of the buttons associated with intake are listened to.

Dana Batali
Dana Batali11:42 AM

all: here's a work-in-progress set of programming guidelines. Its main purpose is to make sure we're all on the same page when it comes to logging and smartdashboard traffic. Please read and ask questions:

https://github.com/dbadb/2018-POWERUP/blob/master/BestPractices.md

Binnur Alkazily
Binnur Alkazily4:31 PM

@Dana Batali @Declan Freeman-Gleason @Ronan Bennett @Noah Martin unfortunately I will be on the 5:45 :/ see you sometime this pm.

Dana Batali
Dana Batali4:57 PM

@Binnur Alkazily stay dry! (thanks for the update)

Justice James
Justice James8:47 PM

@Martin Vroom @Adam Rideout Redeker The ArticulatedGrabber.java code is pretty much finished, all that's left is to test the potentiometer in order to set the ranges, but I don't know when that can happen. If you want the latest code, pull from my fork.

Martin Vroom
Martin Vroom8:47 PM

Nice!

Justice James
Justice James8:51 PM

Thanks!

Darwin Clark
Darwin Clark9:17 PM

@Binnur Alkazily @Dana Batali @Declan Freeman-Gleason >:(

Darwin Clark
Darwin Clark10:16 PM

Yeah, that point is under the 'balls in the air' section of the project. There are a LOT of things that will be done to make this more deployable, and that is one of them; there is a lot of code optimisation that will go on, print statements that will be removed, etc., etc. In addition, Dana and Riyadth are going to do research on how to A) robustify the RoboRIO (such that when power is cut we don't run the risk of corruption) and B) force the RIO to start in some form of vision mode where the pipeline is running from the get-go, with other factors put into place so it can be used right "out of the box" on the field. There is no official list of things to do, but those are a few. Thoughts?

Riyadth Al-Kazily
Riyadth Al-Kazily10:21 PM

Here are a couple of starting points to look into "hardening" a Raspberry Pi to withstand sudden shutdowns:
https://www.chiefdelphi.com/forums/showthread.php?t=160630
https://www.chiefdelphi.com/forums/showthread.php?t=143690

Riyadth Al-Kazily
Riyadth Al-Kazily10:22 PM

Most creative solution I saw: Have RoboRio issue a shutdown command to the Raspberry Pi when there are 20 seconds remaining in the match. Seems a little fragile, but it certainly could work.

Declan Freeman-Gleason
Declan Freeman-Gleason10:24 PM

If we do the latter, we probably still want the former.

Riyadth Al-Kazily
Riyadth Al-Kazily10:24 PM

@Darwin Clark Does the Raspberry Pi need to run X windows while on the robot? The Adafruit link in the first link indicates the read only filesystem doesn't work with X...

Darwin Clark
Darwin Clark10:26 PM

Nope.

Darwin Clark
Darwin Clark10:28 PM

Which is also something that will be optimised in the pipeline: not even consider needing X windows when we run, and only show us windows when we pass a '--debug' argument or something similar.

2018-02-02
Dana Batali
Dana Batali8:58 AM

@Darwin Clark, @Riyadth Al-Kazily: to be clear, during reboot of the pi we don't start the X window server. When Darwin does interactive work, he logs into the console and then types startx. Then he opens a terminal window and starts the python script from the command line. That script can optionally open a display window (via --display 0/1 iirc). As for starting up the python script in "headless" mode, there are two options: 1) we can create an "init script" that will launch the python script (in 'daemon' mode) every time we start up. 2) we can add a button to the dashboard that starts and restarts the vision process. Option #2 may only be necessary if we find that the init sequence is "too early" to grab the camera (a problem we had on the jetson with the usb camera).

Binnur Alkazily
Binnur Alkazily8:45 PM

seriously! I can’t believe no one from programming voted on this w/ emojis! :slightlysmilingface:
https://spartronics.slack.com/files/Nora Wilson/F94CC25M5/sketch1517625039497.jpg

Binnur Alkazily
Binnur Alkazily8:45 PM

Binnur Alkazily
Binnur Alkazily8:46 PM

^^ in the random channel

Justice James
Justice James11:27 PM

I found this helpful website for Git https://smessina.com/gitflow#/

Justice James
Justice James11:28 PM

It seems to have all the commands that we are using.

2018-02-03
Declan Freeman-Gleason
Declan Freeman-Gleason12:19 PM

The two large (green) carpets are 15 feet wide by 40 feet long. The small grey carpet is 10x15 feet.

Declan Freeman-Gleason
Declan Freeman-Gleason12:19 PM

@Declan Freeman-Gleason pinned a message to this channel.

Josh Goguen
Josh Goguen1:03 PM

I will be a few minutes late but I am on my way, sorry

Binnur Alkazily
Binnur Alkazily2:59 PM

Sean Lafer
Sean Lafer3:06 PM

@Sean Lafer has joined the channel

Enrique Chee
Enrique Chee10:16 PM

Great job !!!

2018-02-04
Dana Batali
Dana Batali11:51 AM

@Declan Freeman-Gleason - great job! Concern here is time - movie is 14 sec long (minus 2-3 to start)... Cutting it close to get one cube onto the scale. Love the way it takes the first turn!!

Dana Batali
Dana Batali11:54 AM

programmers: please be sure to push your latest state up to your repos. Coming soon: we'll be asking teams to "pitch" their current code to at least one outside mentor and student as a way to "spread the knowledge" and to provide constructive feedback.

Declan Freeman-Gleason
Declan Freeman-Gleason12:03 PM

I think that the last part can easily be expedited... I was seeing if I could swing into the correct angle without turning in place, but it seems like that's too slow. We should be able to cut off at least 3-4 seconds if we go in faster and just do a turn in place. The closeness to the scale seems wrong, so I'm further examining the accuracy of the path, and I would like to re-measure the field at some point.

Enrique Chee
Enrique Chee12:15 PM

Make sure we have Auto to the switch @Declan Freeman-Gleason

Dana Batali
Dana Batali12:22 PM

that's easy peasy

Dana Batali
Dana Batali12:23 PM

(as long as we have a scissor lift and grabber to play with)

Enrique Chee
Enrique Chee12:31 PM

Great !

Binnur Alkazily
Binnur Alkazily12:44 PM

Let’s plan to remeasure sooner than later — todo for Wednesday

Darwin Clark
Darwin Clark1:21 PM

@Declan Freeman-Gleason Do you know if going in front of the switch, instead of behind it (as you did in your video) is quicker?

Riyadth Al-Kazily
Riyadth Al-Kazily1:26 PM

The trick with going in front is that there are two other robots on your alliance which will also be driving in that space.

Darwin Clark
Darwin Clark1:27 PM

Remind me again: We only know the color of the switch and scale AFTER we put our robot down, correct?

Declan Freeman-Gleason
Declan Freeman-Gleason1:27 PM

Correct

Declan Freeman-Gleason
Declan Freeman-Gleason1:27 PM

Going in front/behind are the same distance, but going behind should take longer because of the need to slow down for the bump in the floor.

Declan Freeman-Gleason
Declan Freeman-Gleason1:28 PM

Still, going behind is better because you're probably not going to run into another robot.

Justice James
Justice James4:04 PM

@Declan Freeman-Gleason Where is the documentation for finding out what side of the switch/scale belongs to each team?

Ryan Olney
Ryan Olney7:38 PM

I just wanted to let you guys know that I won't be able to make the meeting tomorrow because I have already made plans, but I'll try to catch up on anything I missed.

Justice James
Justice James7:39 PM

Is there a meeting tomorrow? I'm not seeing anything on the calendar.

2018-02-05
Darwin Clark
Darwin Clark9:40 AM

@Justice James , check the announcements channel.

Justice James
Justice James9:50 AM

@Darwin Clark Thanks

Justice James
Justice James10:56 AM

I also can't make the meeting because I've already got plans. I will be at every meeting from now up to mid-winter break, however.

Darwin Clark
Darwin Clark11:24 AM

In addition, I won't be at the meeting. My dad has a kidney stone, so my mom needs to take him to a hospital in Seattle and cannot pick me up.

Declan Freeman-Gleason
Declan Freeman-Gleason4:17 PM

@Binnur Alkazily I just realized why this path was so close to the scale! Our measurements probably aren't off; the robot was placed on the wrong side of the alliance wall tape (I forgot that we had to change it).

Binnur Alkazily
Binnur Alkazily4:25 PM

Nice! I would be 100% OK with not having to measure again :smiley:

Binnur Alkazily
Binnur Alkazily4:34 PM

@channel ^^^ I would like to see each of the subsystem leads (whoever was maintaining the master subsystem code) to indicate that 1) they pushed the latest subsystem code; 2) location of the fork they pushed it to
This is CRITICAL so that mentors are looking at the latest code and not wasting time. Thanks!

Binnur Alkazily
Binnur Alkazily4:36 PM

Pls see @Dana Batali's request from yesterday, and my note above ^^^

Binnur Alkazily
Binnur Alkazily6:03 PM

Tip -- if you need to check whether a value is within a given range (such as a potentiometer value +/- some threshold), take a look at the spartronics4915.lib.util.Util.epsilonEquals() method. It will make your code nice and easy to read.
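For reference, an epsilonEquals-style check typically looks like the sketch below. This is a stand-in assuming the usual |a - b| <= epsilon semantics; check the actual Util class for the real signature.

```java
// Sketch of an epsilonEquals-style range check (assumed semantics:
// true when a is within +/- epsilon of b). Not the team's actual Util class.
public class EpsilonUtil {
    static boolean epsilonEquals(double a, double b, double epsilon) {
        return (a - epsilon <= b) && (a + epsilon >= b);
    }

    public static void main(String[] args) {
        double potValue = 29.7;  // hypothetical potentiometer reading
        double target = 30.0;    // hypothetical setpoint
        // Is the potentiometer within +/- 0.5 of the target position?
        System.out.println(epsilonEquals(potValue, target, 0.5));
    }
}
```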

Binnur Alkazily
Binnur Alkazily6:10 PM

@Josh Goguen @Cory_Houser see ^^^
In addition, here are a couple of observations, at least in the current fork I am looking at:
- there is no potentiometer declared -- please use AnalogInput (similar to testbed work to date)
- need to make sure the system state is updated in the onLoop by reading the potentiometer value before you check the wantedStates
- I don't see the code that will allow us to read and set the minimum, maximum, and switch position values -- these should be dev tuning values from SmartDashboard, e.g. SmartDashboard.getValue("ScissorLift/Target1", defaultValue)
- I can't tell how we are testing the scissor lift -- when we have the HW, first thing the engr team will ask is for us to move it, move it... How are these wired up? autonomous? button presses?

Binnur Alkazily
Binnur Alkazily6:13 PM

And, you (applies to everyone!) are always welcome to point me at the right place and show me what I am missing :slightlysmilingface:
Please coordinate w/ @Declan Freeman-Gleason @Ronan Bennett @Noah Martin @Dana Batali as we should be ready w/ our code to test scissor lift by the end of wednesday (unless we get the HW earlier!!!)

Binnur Alkazily
Binnur Alkazily6:15 PM

@Dana Batali and others, anything I am missing, please add to this list. thanks!

Dana Batali
Dana Batali6:17 PM

and do check this out for more tips :slightlysmilingface:

https://github.com/Spartronics4915/2018-POWERUP/blob/master/BestPractices.md

Declan Freeman-Gleason
Declan Freeman-Gleason6:30 PM

@Josh Goguen @Cory_Houser I've been informed by the electronics team that there is also a small spring-loaded cylinder that acts as a brake. I think this got lost in communication, because I don't remember talking about it and didn't see it in your code (correct me if I'm wrong). Fortunately, it's an easy thing to add.

2018-02-06
Dana Batali
Dana Batali2:03 PM

@Declan Freeman-Gleason are there any details on this? is it just another single-sided solenoid? Should we apply the brake when we arrive at the target? Does this mean we now have > 8 solenoids?

Declan Freeman-Gleason
Declan Freeman-Gleason2:23 PM

Declan Freeman-Gleason
Declan Freeman-Gleason2:24 PM

This is the drawing I was looking at (from #electronic-pneumatics).

Dana Batali
Dana Batali2:49 PM

hm... this shows 3 solenoids for the scissor lift - which is what @Josh Goguen and @Cory_Houser were planning for (as far as I can tell).

Josh Goguen
Josh Goguen2:51 PM

We were planning for that, we have the solenoid in our code, we just have not used it anywhere yet @Dana Batali

Chris Rininger
Chris Rininger3:23 PM

IF we get to the climb with friends capabilities, there may be 2 to 4 more switchable releases (could be solenoids) needed for the drop-down forks/scaffolds (two sides). I see you already have the "climbing stabilization cylinder" which I assume means the arms dropped to stabilize against the tower. Depending on design, it might be the case that two individual releases are needed (due to spacing being two wide) rather than one double.

Chris Rininger
Chris Rininger3:23 PM

@Jack Chapman @Ethan Rininger Make sense? Agree?

Declan Freeman-Gleason
Declan Freeman-Gleason3:26 PM

@Chris Rininger That diagram was shared from #electronic-pneumatics, so I would direct any inquiry to that channel.

Riyadth Al-Kazily
Riyadth Al-Kazily4:34 PM

To clarify the brake solenoid: As the scissor lift rises to the point where the software wants to stop it, the software should first engage the brake solenoid, which will hold the scissor lift in place. The software may want to continue to allow air in the lift solenoid for a period of time afterwards, so that the scissor lift is pressurized firmly. If we turn off the lift solenoid too soon there is a chance that the lift will be "bouncy", and could go down a bit while positioning. The brake only keeps the scissor from rising higher, and does not prevent it from going lower.

Riyadth Al-Kazily
Riyadth Al-Kazily4:36 PM

After applying the brake, in order to move the scissor lift again, the brake must be released, and then the scissor lift must be lowered a small amount, even if the desire is to raise it up higher. This is because there is a mechanical cam that is under tension, holding the lift from going higher, until the lift is lowered slightly. Of course, if the lift needs to be lowered anyway, it will be free to do so after the brake is released (ie, no small motion required first).
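The raise-after-brake ordering Riyadth describes could be captured as an explicit step sequence, so the software can't skip the "lower slightly" step that unloads the cam. Everything here is illustrative; the step names are invented for this sketch.

```java
// Illustrative encoding of the brake behavior described above.
// Raising again after a brake requires: release brake, lower slightly
// (to unload the mechanical cam), then raise.
import java.util.Arrays;
import java.util.List;

public class ScissorBrakeSequence {
    enum Step { ENGAGE_BRAKE, HOLD_LIFT_PRESSURE, RELEASE_BRAKE, LOWER_SLIGHTLY, RAISE, LOWER }

    static List<Step> raiseAfterBrake() {
        return Arrays.asList(Step.RELEASE_BRAKE, Step.LOWER_SLIGHTLY, Step.RAISE);
    }

    // Lowering needs no small motion first: just release the brake and go down.
    static List<Step> lowerAfterBrake() {
        return Arrays.asList(Step.RELEASE_BRAKE, Step.LOWER);
    }

    public static void main(String[] args) {
        System.out.println(raiseAfterBrake());
        System.out.println(lowerAfterBrake());
    }
}
```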

Austin Smith
Austin Smith10:06 PM

Is there a set thing of what we’re doing tomorrow at the meeting? I want to be able to prepare.

Declan Freeman-Gleason
Declan Freeman-Gleason10:35 PM

@Austin Smith We want to get your code to a "running" (on the testbed, mostly working) state so that we can do a pull request and code review.

Austin Smith
Austin Smith10:35 PM

Alright. Thanks for letting me know.

2018-02-07
Dana Batali
Dana Batali11:14 AM

programmers: please make sure you have pulled the latest changes from upstream. Next, please make sure to push to your origin. I see, for example, that Austin's repo is way out of date. It wouldn't surprise me if most of yours are too. If you are confused by git (and who isn't?), please corner me or a programming leader today and we'll get you in sync.

Dana Batali
Dana Batali12:30 PM

notes from my review of these repositories: Ryan (Harvester), Martin (ArticulatedGrabber):

- need to look at the results of constructing motors and solenoids for initialization errors (ie motors not present).
For solenoids, there's a new method: `Util.validateSolenoid(solenoid)`.
For talons, `mMotor.isValid()`. If any of these aren't valid, you should log an error message like `logWarning("3rd solenoid is invalid")` and call `logInitialized(false)`.

- need to come up with names for motors, solenoids and DigitalIO pins (like kHarvesterLimitSwitch) and place these into Constants. You can refer to the pneumatics spreadsheet for numbers pinned to our channel. Ultimately @Ronan Bennett is responsible for guarding these values, but that shouldn't keep you from getting your code (and these names) into place. The idea is that we can change the number assignments with no impact on your code.

- need to update to the latest upstream changes: `broadcast--()` methods are now `dashboardPut---()`

- need to make sure that you only invoke solenoid.set() when a state changes. Repeatedly calling this method has the chance of "saturating the CAN bus".

- update wanted and current state to the dashboard when they change (via `dashboardPutState(state.asString())` and `dashboardPutWantedState(wstate.asString())`)
- non-state-related updates to the dashboard should be made within the `outputToSmartDashboard()` method.
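The "only call solenoid.set() on a state change" point can be sketched by caching the last commanded value. SolenoidStub below is a stand-in for the real WPILib Solenoid class (not available in this sketch); the guard pattern is the point.

```java
// Sketch of the "only call solenoid.set() on a state change" guideline.
public class GuardedSolenoid {
    // Stand-in for the real WPILib Solenoid; counts hardware calls.
    static class SolenoidStub {
        int setCalls = 0;
        boolean value;
        void set(boolean on) { setCalls++; value = on; }
    }

    final SolenoidStub solenoid = new SolenoidStub();
    Boolean lastCommand = null;  // unknown until the first command

    // Called every loop iteration; only touches hardware on a change,
    // so repeated identical commands don't saturate the CAN bus.
    void command(boolean on) {
        if (lastCommand == null || lastCommand != on) {
            solenoid.set(on);
            lastCommand = on;
        }
    }

    public static void main(String[] args) {
        GuardedSolenoid g = new GuardedSolenoid();
        g.command(true);
        g.command(true);   // no hardware call: state unchanged
        g.command(false);
        System.out.println(g.solenoid.setCalls); // only 2 hardware calls
    }
}
```

Motor power, by contrast, gets set every loop iteration, as noted in the handleClosing discussion further down.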

Dana Batali
Dana Batali1:42 PM

@Dana Batali pinned a message to this channel.

Noah Martin
Noah Martin4:12 PM

definitive articulator code for now

Ryan Olney
Ryan Olney4:13 PM

definitive harvester code https://github.com/RyanOlney/2018-POWERUP.git

2018-02-08
Dana Batali
Dana Batali9:03 AM

more comments on what is currently up there:

SystemState.articulatorPosition should be int, not double

acceptablePositionError should be int, not double

initial value of mWantedState should be PREPARE_EXCHANGE (?)

configureOutputPower, last number should be negative

you might rename mGrabber1,2 to mGrabber and mGrabberSetup

in onStart: we should initialize mSystemState to represent our understanding of the current state. articulatorPosition is easy. you can use mGrabber1,2.get() to determine the grabberOpen state.

* onLoop():
only call getAverageValue() once:

int potValue = ...getAverageValue();
... handleGrabberPosition(potValue);
handleGrabberState(potValue);

your articulator state change check will always fire, due to AnalogInput noise. perhaps you want to make a call to !epsilonEquals() with the allowedError.
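A minimal sketch of that noise-tolerant check. `epsilonEquals` mirrors the `Util.epsilonEquals` mentioned in this thread, but the class name, `kAllowedError` value, and `positionChanged` helper are all illustrative assumptions:

```java
// Sketch of a noise-tolerant "did the articulator really move?" check.
// kAllowedError is a made-up pot-count tolerance for illustration.
public class PotFilter {
    static final int kAllowedError = 5; // pot counts; illustrative only

    // mirrors the team's Util.epsilonEquals
    static boolean epsilonEquals(double a, double b, double epsilon) {
        return Math.abs(a - b) <= epsilon;
    }

    /** True only when the pot has genuinely moved off the target,
     *  so AnalogInput jitter inside the band never fires the check. */
    static boolean positionChanged(int potValue, int targetPosition) {
        return !epsilonEquals(potValue, targetPosition, kAllowedError);
    }
}
```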

Dana Batali
Dana Batali9:07 AM

on setWantedState, you should call smartDashboard.putWantedState(mWantedState.toString()); (now you can eliminate this one call from outputToSmartDashboard).

in stop() you should set solenoids to false (then update mSystemState)

Dana Batali
Dana Batali9:07 AM

zeroSensors is the correct place to auto-calibrate (if we can assume that the mechanism is in a rest pose)

Dana Batali
Dana Batali9:08 AM

that's all for now :nerd_face:

Dana Batali
Dana Batali9:12 AM

one last thing:

after these changes you are ready to make a pull request. (don't forget to run source->format)

Dana Batali
Dana Batali9:45 AM

comments on the latest version:

generally this looks very clean :+1:

we need a system state for every possible condition. I think this means that we need one called UNKNOWN or DISABLED. This should be the default system state (line 51). Now on line 103, we must be careful (since onStart may be called during autoInit and teleopInit). There we could say:

if(mSystemState == DISABLED) mWantedState = WantedState.CLOSE;


Now in handleClosing:

if (mSystemState != SystemState.CLOSING)
{
    mSolenoid.set(false);
}
mMotorLeft.set(0.0);
mMotorRight.set(0.0);

return defaultStateTransfer();


Note that we want to always update motor power, but only update a solenoid when its state changes.
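Putting the pieces of this review together, here is a compressed, testable sketch of the pattern: a DISABLED default state, an onStart that only forces a wanted state when the subsystem's state is unknown, and a handler that writes the solenoid once on entry but refreshes motor power every loop. All field and enum names follow the messages above, but the class itself is an illustrative toy, not the team's subsystem:

```java
// Minimal sketch of the DISABLED-by-default subsystem state machine
// described above. Not the team's code; hardware is faked with plain
// fields so the write-once solenoid behavior can be asserted.
public class GrabberSketch {
    enum SystemState { DISABLED, CLOSING, OPENING }
    enum WantedState { CLOSE, OPEN }

    SystemState mSystemState = SystemState.DISABLED; // default before onStart
    WantedState mWantedState = WantedState.CLOSE;
    boolean solenoid;
    int solenoidWrites;            // counts hardware writes for the test
    double motorLeft, motorRight;

    void onStart() {
        // onStart may run at both autoInit and teleopInit; only force a
        // wanted state when we truly don't know where we are.
        if (mSystemState == SystemState.DISABLED)
            mWantedState = WantedState.CLOSE;
    }

    SystemState handleClosing() {
        if (mSystemState != SystemState.CLOSING) { // entering the state
            solenoid = false;                      // solenoid: on change only
            solenoidWrites++;
        }
        motorLeft = 0.0;                           // motors: every loop
        motorRight = 0.0;
        return defaultStateTransfer();
    }

    SystemState defaultStateTransfer() {
        return mWantedState == WantedState.CLOSE ? SystemState.CLOSING
                                                 : SystemState.OPENING;
    }

    void onLoop() { mSystemState = handleClosing(); }
}
```

Running the loop repeatedly produces exactly one solenoid write, since every later pass finds the state already CLOSING.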

Dana Batali
Dana Batali9:52 AM

note that this would trigger on the first call since the default clause at line 132 would run. Probably better to add the new `case DISABLED` for clarity.

Dana Batali
Dana Batali9:55 AM

this same idea can be applied to each of your handlers (ie: you may not need the mWantedState within each handler, this is taken care of by defaultStateTransfer).

Dana Batali
Dana Batali9:58 AM

in outputToSmartDashboard() you can reduce the amount of 'noise' by, for example:


dashboardPutNumber("MotorRight", mMotorRight.get());

Dana Batali
Dana Batali9:59 AM

(i.e.:
1. there are a number of methods in the superclass to prepend the subsystem name automatically
2. there are routines to put numbers, strings and booleans)

Dana Batali
Dana Batali10:00 AM

stop should probably be implemented... something like:

mMotorRight.set(0);
mMotorLeft.set(0);
mSolenoid.set(false);
mSystemState = SystemState.DISABLED;

Dana Batali
Dana Batali10:03 AM

finally: I'm very optimistic that @Randy Groves’s IRSensor will be able to reliably deliver a cube-presence/absence signal. It's probably wise to make sure electronics and mechanics are in the loop. Specifically: if we can mount Randy's sensor on the camera plate and make sure we can get a wire from there to the analog input pin, we can officially eliminate the request for a limit switch to detect the cube position.

Dana Batali
Dana Batali10:04 AM

one more finally: as we discussed, the idea that we might want motors running whilst in "open" mode will add more states to your subsystem.

Dana Batali
Dana Batali10:08 AM

one final finally: since you have motors on each side of the harvester, you need to consider which directions you'd like to turn. I suspect that for cube acquisition, the right needs to turn the opposite direction from the left. This can be accomplished in two ways:

1. during motor construction you can invoke setInvert() on the reversed motor

-or-

2. you can send opposing signals to the motors:

m1.set(1.0);
m2.set(-1.0);


I think option #1 is preferable.
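Here is a small sketch contrasting the two options. `FakeMotor` and its `setInverted()` are stand-ins modeled on the Talon-style API mentioned above; the numbers and names are illustrative. Note how option 1 pushes the sign into construction, so every later caller uses one speed value:

```java
// Sketch of the two ways to run opposed harvester motors.
// FakeMotor stands in for the Talon wrapper; setInverted() mirrors
// the invert-at-construction call named in the message above.
public class HarvesterMotors {
    static class FakeMotor {
        private boolean inverted;
        private double output;
        void setInverted(boolean i) { inverted = i; }
        void set(double v) { output = inverted ? -v : v; }
        double get() { return output; }
    }

    /** Option 1: invert once during construction. */
    public static double[] option1(double intakeSpeed) {
        FakeMotor left = new FakeMotor(), right = new FakeMotor();
        right.setInverted(true);   // wiring-level fix, done in one place
        left.set(intakeSpeed);     // callers use one sign everywhere
        right.set(intakeSpeed);
        return new double[] { left.get(), right.get() };
    }

    /** Option 2: every caller sends opposing signals. */
    public static double[] option2(double intakeSpeed) {
        FakeMotor left = new FakeMotor(), right = new FakeMotor();
        left.set(intakeSpeed);     // every call site must remember the sign
        right.set(-intakeSpeed);
        return new double[] { left.get(), right.get() };
    }
}
```

Both produce the same physical outputs; option 1 just makes it impossible for one call site to forget the minus sign, which is presumably why it's preferred.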

Dana Batali
Dana Batali10:08 AM

that's all for now :nerd_face:

2018-02-09
clint
clint9:36 AM

@clint has joined the channel

Cory_Houser
Cory_Houser3:46 PM

Just letting everyone know that I will be leaving at 6 today and won’t be coming back

Dana Batali
Dana Batali4:05 PM

fyi programming is in the main building

Tarkan Al-Kazily
Tarkan Al-Kazily9:29 PM

@Tarkan Al-Kazily has joined the channel

2018-02-10
Dana Batali
Dana Batali8:35 AM

- yippee, WPI lib update dropped today (18.3.1)... I'll be updating the repo, but programmers should update theirs first thing today.

Dana Batali
Dana Batali8:36 AM

@Declan Freeman-Gleason @Ronan Bennett - i think i found the mysterious bug. The LED subsystem is currently constructing a DigitalOutput based on a relay-ID. We can discuss the fix first thing today.

Declan Freeman-Gleason
Declan Freeman-Gleason9:14 AM

We should probably just remove those, but keep the remainder of the subsystem. We probably want the vision light functionality, and we might want to be able to communicate with bling.

Justice James
Justice James12:50 PM

Hey, I'm going to be slightly late today.

Cory_Houser
Cory_Houser12:53 PM

Are we in the main building today

Declan Freeman-Gleason
Declan Freeman-Gleason12:59 PM

Yes, we're in the 200 building again

Ryan Olney
Ryan Olney2:49 PM

@Dana Batali @Randy Groves After talking to the Engineering team working on the Harvester, it does not sound like there is going to be an IR sensor to detect when the cube is in; we are going to use the limit switch instead.

Darwin Clark
Darwin Clark4:35 PM

@Ryan Olney Did they give a reason why not? Dana and I were interested in using the IR sensor in vision.

Riyadth Al-Kazily
Riyadth Al-Kazily4:39 PM

I think the reason is that they already finished the limit switch. That doesn't mean we can't also add the IR sensor, but we will have to wait until more of the robot is complete. Nobody can commit to any space availability. (Dana says that we should be able to mount the IR sensor on our Raspberry Pi camera board...)

Darwin Clark
Darwin Clark4:40 PM

Ah, okay. That should work fine. I was just concerned that it had been completely discarded.

Riyadth Al-Kazily
Riyadth Al-Kazily4:41 PM

(It's so small we could hide it on the robot and nobody would ever know ;-)

2018-02-11
Justice James
Justice James1:01 PM

Sorry, but I will be slightly late again.

Randy Groves
Randy Groves1:11 PM

Running late today

Binnur Alkazily
Binnur Alkazily1:40 PM

@Ronan Bennett @Noah Martin I see the following entries in the strategy playbook (and I hear Ronan added this). It would be useful to update this w/ our matching test controls from our dashboard

All Discrete Controllable Functions of the Mechanisms

Harvester
Arms Open
Arms closed
Intake inwards
Intake off
Intake eject

Scissor Lift
go to bottom height
go to switch height
go to scale height
go to climb height (separate height?)
manual up (jog)
manual down (jog)

Articulated Grabber
Flipper retract - home position
Flipper extend - grabbing position
Grab cube
Release cube

Climber
Off (no stabilizer arms)
Extend stabilizer arms
possibly two separate functions for each arm individually
Climb
Stop climbing once started

Noah Martin
Noah Martin8:25 PM

we are planning on meeting after school Tuesday 3:15 - 5:00 to test the robot and work on testing specific subsystems. Hope to see everyone who can make it. :grin:

Darwin Clark
Darwin Clark8:38 PM

The completed robot?

Riyadth Al-Kazily
Riyadth Al-Kazily8:42 PM

"Completed" is a very strong word...

Darwin Clark
Darwin Clark8:45 PM

"Polished"

Riyadth Al-Kazily
Riyadth Al-Kazily8:46 PM

I think you're going to get an 'adequate' robot, at least to start.

Justice James
Justice James8:46 PM

Robot > No robot

Chris Rininger
Chris Rininger8:48 PM

Minimum viable robot = drive-able + grabber + lift. You're getting that + the harvester. Climber to come later. Climb with Friends officially cut.

Binnur Alkazily
Binnur Alkazily8:50 PM

^^^climb w/ friend(s) cut due to weight constraints — we hit 110 w/ the robot as is, and each flaps estimated ~5.5lb x2 for a friend

Paul Vibrans
Paul Vibrans8:53 PM

Don't give up just yet. We may be able to climb with one using a single CIM and some serious efforts at weight reduction. Think lightening holes and sniped corners on structure. We won't have it for the first match.

Binnur Alkazily
Binnur Alkazily8:55 PM

@Paul Vibrans This should go without saying, but I will say it — any weight reduction efforts will NOT impede programmers time with the robot :sunglasses:

Paul Vibrans
Paul Vibrans8:56 PM

Of course not.

Chris Rininger
Chris Rininger9:23 PM

Chris Rininger
Chris Rininger9:33 PM

Chris Rininger
Chris Rininger9:42 PM

Speaking of driving, Will said he'll bring in the joystick (Jon's Saitek) he wants to use, and James S. confirmed he would like to try the new controller board Coach bought that has a digital joystick + 8 large buttons + some smaller buttons. It is with the other spare joysticks, a few lower cabinets down on the left-hand side as you enter the Robot room from the hallway shared with Coach Chee's classroom. It was with electronics, but I moved it to that cabinet today because the original drawer it was in wasn't labeled. I think the cabinet is labeled "Programmer's keyboards" or something like that.

Enrique Chee
Enrique Chee9:56 PM

Make sure we have a backup laptop for the driver station and backups for all joysticks and controller boards.

Chris Rininger
Chris Rininger9:57 PM

Let's hold on buying backup joystick/controller boards until we're sure the ones Will and James picked will work, but generally I agree we should have backups

Chris Rininger
Chris Rininger9:58 PM

The HOTAS stick I asked Coach to buy a while back is in that same cabinet as the operator controller board.

Chris Rininger
Chris Rininger9:58 PM

It's similar to the one Will will bring in - could possibly work as backup.

2018-02-12
Justice James
Justice James7:54 AM

Do we have a meeting after school today?

Declan Freeman-Gleason
Declan Freeman-Gleason8:22 AM

The programmers aren't meeting today

Chris Rininger
Chris Rininger1:50 PM

Here's a 2 cube auto as well for inspiration: https://www.youtube.com/watch?v=7BWLN9Hlk28

Dana Batali
Dana Batali4:45 PM

@Noah Martin @Justice James @Noah Martin: comments/questions on the ArticulatedGrabber

Dana Batali
Dana Batali4:46 PM

line 244:

i don't understand why you would be adding the potValue to the scalePosition (and only in the TRANSPORT case). I would think that scalePosition, intakePosition and homePosition are approximately constant throughout a competition.

Dana Batali
Dana Batali4:49 PM

member variable naming convention:

our coding convention requires that member variables begin with m. This means that scalePosition, intakePosition, et al. should be mScalePosition, mIntakePosition, etc.

Dana Batali
Dana Batali4:50 PM

mGrabberSetup - I'm curious what this solenoid's job is?

Dana Batali
Dana Batali4:53 PM

line 152: seems like this will always print spewage since potentiometer values are inherently noisy. I believe you want a !epsilonEquals around these print statements. Or perhaps you want to remove these prints in favor of the outputToSmartDashboard method.

Justice James
Justice James5:00 PM

line 244: We are adding potValue to scalePosition for calibration (we also do that in PREPARE_EXCHANGE). Ideally these would be constant, but in case the potentiometer gets bumped we need this, because so much of our code depends on the potentiometer.

Justice James
Justice James5:00 PM

We can refactor the variables on Wednesday.

Dana Batali
Dana Batali5:01 PM

i don't see a way to tune the 3 magic motor/potentiometer values via the dashboard. I believe the steps needed are:

during zeroSensors:


mScalePosition = dashboardGetNumber("Target1", kDefaultScalePosition);
mIntakePosition = dashboardGetNumber("Target2", kDefaultIntakePosition);
..etc..

Dana Batali
Dana Batali5:04 PM

what I think you want is a mPotZero member variable that represents a value you read during zeroSensors. Then you just say:

mScalePosition += mPotZero in zeroSensors (only there; otherwise the value of scale position will keep growing on each loop, and that's a big problem)
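The mPotZero idea above can be sketched as follows. The constant names and numbers are illustrative, not the team's actual Constants; the point is that the one-time measurement in zeroSensors() is the only place the offset is applied:

```java
// Sketch of the mPotZero calibration idea: measure the pot once in
// zeroSensors() and derive the setpoints from hard-coded offsets plus
// that single measurement. Constant names/values are made up.
public class ArticulatorCalibration {
    static final int kDefaultScaleOffset = 400;  // counts above rest pose
    static final int kDefaultIntakeOffset = 50;

    int mPotZero;          // pot reading at the known rest pose
    int mScalePosition;
    int mIntakePosition;

    /** Call once per enable, while the mechanism sits in its rest pose.
     *  Adding mPotZero anywhere else (e.g. every loop) would make the
     *  setpoints grow without bound. */
    void zeroSensors(int restPotReading) {
        mPotZero = restPotReading;
        mScalePosition = kDefaultScaleOffset + mPotZero;
        mIntakePosition = kDefaultIntakeOffset + mPotZero;
    }
}
```

Re-running zeroSensors() after the pot drifts simply shifts all setpoints by the same amount, which is exactly the recalibration behavior described above.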

2018-02-13
Paul Vibrans
Paul Vibrans12:51 PM

Are programmers meeting today, Tuesday?

Binnur Alkazily
Binnur Alkazily12:58 PM

yes - 3:15 to 5pm

Binnur Alkazily
Binnur Alkazily12:59 PM

@Dana Batali will be there from mentors -- I am not sure if any other mentor will be able to make it. Unfortunately, this is turning into a busy work week for me :disappointed:

Binnur Alkazily
Binnur Alkazily1:05 PM

no pressure... but, I am looking forward to videos on programming channel to watch!

Martin Vroom
Martin Vroom3:42 PM

I won't be there today, I have biking which ends at 5:15.

Justice James
Justice James4:12 PM

Sorry I'm not there, I didn't know that we had a meeting. Is there some sort of schedule that was sent out that I missed?

Michelle Dalton
Michelle Dalton4:37 PM

I'm not available as I had to work in the city today. Won't make it back in time.

Ronan Bennett
Ronan Bennett5:10 PM

Noah posted in this channel on Sunday. You didn’t miss any valuable subsystem testing today - we were just setting up the robot

Peter Hall
Peter Hall5:30 PM

Binnur Alkazily
Binnur Alkazily5:37 PM

Is that the teaser!!! What about the scissor? And that was snappy :)

Chris Rininger
Chris Rininger5:59 PM

Cool to see the harvester engage & disengage. Thanks for sharing.

Chris Rininger
Chris Rininger6:03 PM

Regarding the climbing winch & thinking about the states concept, can we make it impossible to start the winch until the lift is high? According to Paul, the geometry of the ropes clearing the lift’s joint protrusions only works when the lift is up at the height where the hooks will catch the rung.

Chris Rininger
Chris Rininger6:04 PM

Starting up the winch with the lift low could be a bad scene

Binnur Alkazily
Binnur Alkazily6:11 PM

^^ yes, super structure can manage the dependency required

Chris Rininger
Chris Rininger6:40 PM

@Paul Vibrans: Do we also need to disallow the winch from raising the robot above a certain height because the lift will fold too much & the lift joints will start conflicting with the ropes?

Chris Rininger
Chris Rininger6:40 PM

This is related to that little side conversation we had.

Dana Batali
Dana Batali6:49 PM

More later but Peter has 10 movies of various components operating.. exec summary: all systems "go"...

Paul Vibrans
Paul Vibrans6:54 PM

I did not check for part height interferences in the CAD model. We should do it with the real robot.

2018-02-14
Peter Hall
Peter Hall7:51 AM

Peter Hall
Peter Hall7:54 AM

Peter Hall
Peter Hall7:55 AM

Peter Hall
Peter Hall7:59 AM

Peter Hall
Peter Hall8:01 AM

Peter Hall
Peter Hall8:06 AM

That's all of the good ones I have

Binnur Alkazily
Binnur Alkazily8:33 AM

@Peter Hall there goes my productivity this morning :) looking forward to watching these. Thanks!!

Violet Advani
Violet Advani8:40 AM

@Violet Advani has joined the channel

Dana Batali
Dana Batali8:42 AM

a testing analysis thread begins

Dana Batali
Dana Batali8:43 AM


UNKNOWN.. [388.87] -3 CTR: CAN frame not received/too-stale. Talon SRX 1 GetBusVoltage at com.ctre.phoenix.motorcontrol.can.MotControllerJNI.GetBusVoltage
UNKNOWN.. [388.88] -3 CTR: CAN frame not received/too-stale. Talon SRX 1 GetBusVoltage com.ctre.phoenix.motorcontrol.can.MotControllerJNI.GetBusVoltage(Native Method) com.ctre.phoenix.motorcontrol.can.BaseMotorController.getBusVoltage(BaseMotorController.java:390) com.ctre.phoenix.motorcontrol.can.BaseMotorController.getMotorOutputVoltage(BaseMotorController.java:406) com.spartronics4915.lib.util.drivers.TalonSRX4915.getOutputVoltage(TalonSRX4915.java:407) com.spartronics4915.lib.util.drivers.TalonSRX4915Drive.outputToSmartDashboard(TalonSRX4915Drive.java:300) com.spartronics4915.frc2018.subsystems.Drive.outputToSmartDashboard(Drive.java:233) com.spartronics4915.frc2018.SubsystemManager.lambda$outputToSmartDashboard$0(SubsystemManager.java:23) java.util.Arrays$ArrayList.forEach(Arrays.java:3880) com.spartronics4915.frc2018.SubsystemManager.outputToSmartDashboard(SubsystemManager.java:23) com.spartronics4915.frc2018.Robot.allButTestPeriodic(Robot.java:491) com.spartronics4915.frc2018.Robot.teleopPeriodic(Robot.java:371) edu.wpi.first.wpilibj.IterativeRobotBase.loopFunc(IterativeRobotBase.java:213) edu.wpi.first.wpilibj.IterativeRobot.startCompetition(IterativeRobot.java:41) edu.wpi.first.wpilibj.RobotBase.main(RobotBase.java:250)
UNKNOWN.. [579.49] -3 CTR: CAN frame not received/too-stale. Talon SRX 3 GetBusVoltage at com.ctre.phoenix.motorcontrol.can.MotControllerJNI.GetBusVoltage
UNKNOWN.. [579.50] -3 CTR: CAN frame not received/too-stale. Talon SRX 3 GetBusVoltage com.ctre.phoenix.motorcontrol.can.MotControllerJNI.GetBusVoltage(Native Method)

Dana Batali
Dana Batali9:11 AM

Pot values for scissor lift don't look right:

[34.35] NOTICE... Robot: running test mode ScissorLift variant:raise -------------------------
[34.35] NOTICE... Waiting 5 seconds before running test methods.
[39.35] NOTICE... ScissorLift checkSystem (raise) ------------------
[39.35] NOTICE... ScissorLift raise check -----
[39.36] NOTICE... ScissorLift raise: false (2sec)
[41.36] NOTICE... ScissorLift pot: 609
[41.36] NOTICE... ScissorLift raise: true (3.5 sec)
[44.87] NOTICE... ScissorLift pot: 611
[44.87] NOTICE... ScissorLift raise: false (3.5 sec)
[48.37] NOTICE... ScissorLift pot: 608
[48.37] NOTICE... Robot: ALL SYSTEMS PASSED

Dana Batali
Dana Batali9:32 AM

@Darwin Clark - if you give me the "corrupt" sd card, I can reburn it at home (process takes > 45m)

Dana Batali
Dana Batali9:53 AM

flipper needs auto calibration... Will suggested that we only hit the limit switches during calibration and that our target points should be away from these extremes by some delta. We should probably enable braking on this motor as well as consider current limiting.

Dana Batali
Dana Batali9:57 AM

harvester "hug" needs to implement the SpartIRSensor for its safety. The biggest risk to the robot, I understand, is that we don't want motors to run when the arms are retracted, so if the distance-check doesn't work we could consider a potentiometer on the arms. Since we're getting low on AnalogInput ports on roborio, we could consider wiring it into a talon.

Darwin Clark
Darwin Clark9:58 AM

@Dana Batali the 'corrupt' SD is in robotics room in vision box. At some point I would like to burn another SD just for practice. Declan showed me the extras in the robotics Room.

Dana Batali
Dana Batali9:58 AM

Which also raises the possibility of wiring the flipper potentiometer into the flipper talon motor (this would only be worth it if its an advantage in wiring)

Dana Batali
Dana Batali10:01 AM

@Darwin Clark - turns out the extras are too slow (they are class 4 and we need class 10). I experimented with burning some and they triggered a kernel panic while attempting to boot from them. This is why I purchased the ones you have in your hands. I'm hopeful that 3-4 is sufficient backup for our purposes... Remember: as you make changes, each will be out of date with the others.

Darwin Clark
Darwin Clark10:02 AM

Yep, so if you fix the corrupted one than that should be 3 (perhaps 4 if the big SD has a baby SD inside) including the card in the robot.

Dana Batali
Dana Batali10:03 AM

i have one more at home in what we are calling the backup raspberry pi.

Riyadth Al-Kazily
Riyadth Al-Kazily10:04 AM

At one point the plan was to have the flipper potentiometer wired in to the Talon (along with the limit switches). I think it should be, so that you can take advantage of soft limits in addition to the hard limit switches. (This was the configuration we used for Gaia's elevator.)

Declan Freeman-Gleason
Declan Freeman-Gleason10:16 AM

You can probably just mount the SD card and edit the script, unless I'm missing something, or because mounting is too much trouble.

Darwin Clark
Darwin Clark10:19 AM

Yeah, that's what I thought, I logged into the pi with the monitor yesterday and deleted the script entirely. Even then, it still hung in boot for much longer than normal giving the 'start job is busy' (or something like that) error. I eventually got fed up and just threw in a fresh SD.

Declan Freeman-Gleason
Declan Freeman-Gleason10:20 AM

I mean attaching the SD card to your computer and actually going into the file system.

Darwin Clark
Darwin Clark10:21 AM

That's what I did, but just through the PI, unless I'm missing something

Darwin Clark
Darwin Clark10:22 AM

You can check the SD card and I'm fairly sure the script does not exist anymore.

Ryan Olney
Ryan Olney12:44 PM

I'm probably going to be about 30 minutes late today

Cory_Houser
Cory_Houser3:32 PM

I will be 30 min late today

Jim Carr
Jim Carr7:24 PM

@Jim Carr has joined the channel

2018-02-16
Mike Rosen
Mike Rosen9:52 AM

I'll be there from about 5 to 7 tonight. Sorry for cutting out early. I'll encourage programmers who could use some help / second set of eyes, to hit me up for this: Let's be sure our subsystems can talk to the hardware OK and that our required state transitions work as expected. @Ryan Olney?

Dana Batali
Dana Batali10:03 AM

all: there's yet-another WPI library update. I'll check in updates to github, but everyone: please update yours via eclipse. After the update you should be at version 18.3.2.

Riyadth Al-Kazily
Riyadth Al-Kazily12:23 PM

: ^^^ If you are like me, you will find upgrading your Eclipse libraries much easier at home than at school. (I can never seem to get beyond 29% of an update on the school LAN for some reason...)

2018-02-17
Chris Rininger
Chris Rininger1:04 PM

QQ: Any sense for likelihood we'll have fully functional (including controls) robot for driving practice at 11 tomorrow?

Martin Vroom
Martin Vroom1:12 PM

I'll be late today. Need to eat and my parents were using my computer.

Justice James
Justice James1:17 PM

I'm also running slightly late.

Justice James
Justice James2:51 PM

New WPI library update

Chris Rininger
Chris Rininger4:11 PM

Sharing here because several of the observations pertain to programming

Justice James
Justice James5:07 PM

@Ronan Bennett @Noah Martin @Martin Vroom@Declan Freeman-Gleason I will not be here Sunday-Friday. My code is in my fork, but I haven't submitted a pull request. My code has all the current Manual code for ArticulatedGrabber in comments, marked by TODOs. I can submit a pull request sometime tonight, if anyone wants me to.

Martin Vroom
Martin Vroom5:52 PM

I won't be at robotics. I'll be away until the Monday following tomorrow.

Binnur Alkazily
Binnur Alkazily6:12 PM

just to wrap up our afternoon discussion, the harvesting controls and scissor is operational. With that said,
@Declan Freeman-Gleason I don’t know what code is running on the robot and IF these controls are wired to it. Pls validate if you can.
@Chris Rininger I didn’t get the sense there would be a driving practice tomorrow, though…

Declan Freeman-Gleason
Declan Freeman-Gleason7:24 PM

@Chris Rininger Because of the second robot, we should not allocate time to driving right now. This time is better used fixing the flipper design, or programming. We will have the time later, either just before or after bag day.

Chris Rininger
Chris Rininger7:26 PM

understood - alright team, let's get that 2nd robot done!

Chris Rininger
Chris Rininger11:18 PM

tip worth sharing from team that played in week 0: "word of caution on reading the gamedata: it does NOT come in immediately. This took us all of quals to figure out, even though I now remember seeing a thread about it some time ago. Lesson: read the game data in a loop, and keep looping until you get >0 characters (perhaps with a timeout to be safe)." perhaps we're already doing that; if not, take heed
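That tip can be sketched as a small polling loop. The `Supplier<String>` stands in for WPILib's `DriverStation.getGameSpecificMessage()`, and an attempt count stands in for a wall-clock timeout so the loop is testable off the robot; the class and method names are illustrative:

```java
import java.util.function.Supplier;

// Sketch of the "keep reading game data until non-empty, with a timeout"
// tip from the week-0 team. The Supplier stands in for
// DriverStation.getGameSpecificMessage(); attempt counts replace
// wall-clock time so the loop can run (and be tested) anywhere.
public class GameDataReader {

    /** Poll until we get >0 characters or run out of attempts. */
    public static String waitForGameData(Supplier<String> source,
                                         int maxAttempts) {
        for (int i = 0; i < maxAttempts; i++) {
            String data = source.get();
            if (data != null && !data.isEmpty())
                return data;        // e.g. "LRL" once the FMS sends it
            // on the robot you'd sleep briefly here between polls
        }
        return "";                  // timed out: fall back to a safe auto
    }
}
```

The timeout branch matters: if the data never arrives, auto should still run something safe rather than spin forever.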

2018-02-18
Dana Batali
Dana Batali8:29 AM

@Noah Martin @Binnur Alkazily @Riyadth Al-Kazily @Paul Vibrans: flipper limit-switch mystery potentially solved. I see that there is an innocuous Timer.delay(.1) in the inner loop of this test. Seems entirely likely that 1/10 of a second is sufficient time to "miss" the limit switch signal. When we started testing, we purposely selected low power/speed... As testing evolved we increased the speed and thus the likelihood of missing the important moment.

here's the code for ref

{
    logNotice("limit switch encountered at " + mPotentiometer.getValue());
    logNotice("mFwdPotentiometerValue before: " + mFwdLimitPotentiometerValue);
    mFwdLimitPotentiometerValue = mPotentiometer.getAverageValue();
    logNotice("mFwdPotentiometerValue after: " + mFwdLimitPotentiometerValue);
    break;
}
else if (t.hasPeriodPassed(10))
{
    logError("fwd 1s didn't encounter limit switch!!!!!!!");
    success = false;
    break;
}
else
{
    Timer.delay(.1);
    if (counter++ % 10 == 0)
        logNotice("    pot: " + mPotentiometer.getValue());
}

Dana Batali
Dana Batali8:34 AM

I think two changes are needed: 1) set Timer.delay(.01) (or even zero) and 2) change the counter check from 10 to 100 or 1000.
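A toy model makes the "missed the switch at 100 ms" hypothesis easy to check: a switch that is only closed for a short window is easy to miss when polled every 100 ms and easy to catch at 10 ms. The window width and times below are invented for illustration, not measured from the flipper:

```java
// Toy model of the missed-limit-switch bug: does any poll at the given
// period land inside the window of time the switch is closed?
// All times are in milliseconds and purely illustrative.
public class PollingDemo {

    static boolean catchesSwitch(int periodMs, int windowStartMs,
                                 int windowEndMs, int totalMs) {
        for (int t = 0; t <= totalMs; t += periodMs) {
            if (t >= windowStartMs && t <= windowEndMs)
                return true;    // a poll landed while the switch was closed
        }
        return false;           // every poll straddled the window
    }
}
```

With a 30 ms closure window, a 100 ms poll can straddle it entirely, while a 10 ms poll cannot, which matches the proposed fix of shrinking Timer.delay and scaling the log counter accordingly.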

Enrique Chee
Enrique Chee8:38 AM

Probably not, but could this affect why the grabber could not lift with a cube ?

Enrique Chee
Enrique Chee8:39 AM

Just to clarify, I meant grabber move with a cube and not the scissor lift.

Riyadth Al-Kazily
Riyadth Al-Kazily8:46 AM

This would not affect the strength of the pivot arm. We measured the current fed to that motor, and it was nearly at maximum power delivery. If we had run it for more than a few seconds, it would have tripped the breaker to the motor.

Dana Batali
Dana Batali8:56 AM

agreed - this is only an on-off switch, not the strength of the flipper

Dana Batali
Dana Batali8:58 AM

: now they're just being cruel. New wpi update 18.3.3. Be there or be square.

Binnur Alkazily
Binnur Alkazily9:11 AM

for the flipper arm, how are we updating the mLimitSwitchRev & mLimitSwitchFwd values? Mechanics manually adjusted the limit switch location (mRevLimitSwitchRev?) for picking the robot multiple times yesterday. The zeroSensors() will only update that value IF the limit switch is hit. And the looper uses the constants if the switches aren't hit. What am I missing?

Dana Batali
Dana Batali9:15 AM

if (mPositionMotor.get() < 0 && !mLimitSwitchRev.get())
are the checks to look for... The switch is normally closed, fwiw.

Dana Batali
Dana Batali9:16 AM

(line 292)

Binnur Alkazily
Binnur Alkazily9:22 AM

as R said, I need more coffee -- my question is how are we updating the mRevLimitPotentiometerValue & mFwdLimitPotentiometerValue in the code. As I read it so far, at best one of those values gets set during zeroSensors() and the rest of the time we are using our own predefined constants

Riyadth Al-Kazily
Riyadth Al-Kazily9:24 AM

Line 418 says limit switches are normally open, which matches the logic. Negating the return of the get() method in our ifs indicates to me that a zero means "limit switch pressed", which would indicate a normally open switch (switch pulls to ground when closed, allows signal to float high when open).

Dana Batali
Dana Batali9:25 AM

trust the code, not my pre-coffee meanderings... Anyone sense a pattern here? :slightlysmilingface:

Binnur Alkazily
Binnur Alkazily9:26 AM

just finished my 1st cup and definitely requiring a 2nd one... (hint, hint, @Riyadth Al-Kazily)

Dana Batali
Dana Batali9:27 AM

zeroSensors is currently invoked during autoInit and teleInit. We discussed the idea of recalibrating on any moment where limit switch fires but hadn't decided if it was a good idea

Binnur Alkazily
Binnur Alkazily9:30 AM

in my state of needing more caffeine -- doesn't look like mRev/FwdLimitPotentiometerValue is doing much if anything -- we just need to ensure the pick, place, hold positions are tuned correctly.

Dana Batali
Dana Batali9:33 AM

the current design assumes:
1. potentiometer drift occurs relative to endpoints.
2. set positions, if treated as offsets from those endpoints, are stable (we haven't validated this claim).
3. so the problem amounts to computing new setpoints as a combination of measured endpoints and our hard-coded offsets.
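The arithmetic behind those three assumptions is simple enough to write down. Here the offsets are fixed constants and only the endpoints are re-measured; all names and numbers are illustrative, not the actual flipper constants:

```java
// Sketch of the endpoint-based recalibration described above: offsets
// from the measured endpoints are fixed, so re-measuring the endpoints
// recomputes the setpoints. Names and values are illustrative.
public class EndpointCalibration {
    static final int kHoldOffset = 100;  // fixed offset below fwd endpoint
    static final int kPickOffset = 40;   // fixed offset above rev endpoint

    int mHoldPosition, mPickPosition;

    /** Recompute setpoints from freshly measured limit-switch endpoints. */
    void recalibrate(int fwdLimitPot, int revLimitPot) {
        mHoldPosition = fwdLimitPot - kHoldOffset;
        mPickPosition = revLimitPot + kPickOffset;
    }
}
```

If the pot (or a mechanically adjusted switch) drifts by 20 counts, re-running recalibrate() shifts the setpoints by the same 20 counts, which is assumption 3 in one line each.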

Binnur Alkazily
Binnur Alkazily9:34 AM

from our prior experience #2 has been true -- the delta between the set positions have been stable

Dana Batali
Dana Batali9:34 AM

btw: a similar set of assumptions is present within the scissorlift code, so if we find flaws in them we'll need to fix both pieces of code.

Binnur Alkazily
Binnur Alkazily9:36 AM

for the flipper arm, the zeroSensors() for the teleInit may not get triggered based on where the flipper arm is left -- so, we should ensure at the end of autonomous we always go back to a known position for the flipper arm (back to carry?)

Binnur Alkazily
Binnur Alkazily9:36 AM

If we do that, then we should update the following code to always use a known limit sensor position

private void updatePositions()
{
    mHoldPosition = mFwdLimitPotentiometerValue -
            dashboardGetNumber("Target1", kDefaultHoldOffset).intValue();
    mPickPosition = mRevLimitPotentiometerValue +
            dashboardGetNumber("Target2", kDefaultPickOffset).intValue();
    mPlacePosition = mRevLimitPotentiometerValue +
            dashboardGetNumber("Target3", kDefaultPlaceOffset).intValue();

    logNotice("hold position: " + mHoldPosition);
    logNotice("pick position: " + mPickPosition);
    logNotice("place position: " + mPlacePosition);
}

Binnur Alkazily
Binnur Alkazily9:37 AM

^^^ currently this is using both FwdLimit and RevLimit

Justice James
Justice James9:38 AM

The articulatedGrabber team talked it over and decided to not do that, in case we are in a position directly over the scale. But we definitely could change that if wanted.

Riyadth Al-Kazily
Riyadth Al-Kazily9:39 AM

On the topic of limits and potentiometers, I think the design might be easier if the "carry" and "grab" positions were solely determined by limit switch positions, and the "place" position is the only one that is based on a potentiometer-derived position. I can envision a method 'armPosition()' which returns an enumeration of possible arm positions, and in that method it checks limit switches first (to see if at "carry" or "grab" positions), and then checks potentiometer to see if at "place" position. Two additional "positions" of "between carry and place" and "between place and grab" would allow goal directions to be calculated from the enumeration. This way, all the epsilonEquals() stuff happens in one spot (and only with the "place" state). (I am concerned with the large number of places in the code where the epsilonEquals() is used, and how if we have too small a kAcceptablePositionError then we could miss our target).
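Riyadth's armPosition() idea above can be sketched directly. The enum values follow his description; the pot threshold, tolerance, and the assumption that "carry" sits at the high (forward) end of the pot range are all illustrative guesses, and the limit switches win outright so epsilonEquals appears in exactly one place:

```java
// Sketch of the armPosition() enumeration idea: limit switches decide
// the two endpoint poses, the potentiometer (one epsilonEquals) decides
// only PLACE, and everything else maps to a "between" value.
// Thresholds and the pot orientation are illustrative assumptions.
public class ArmPositionSketch {
    enum ArmPosition { CARRY, BETWEEN_CARRY_AND_PLACE, PLACE,
                       BETWEEN_PLACE_AND_GRAB, GRAB }

    static final int kPlacePot = 500;        // made-up place setpoint
    static final int kAcceptableError = 10;  // made-up tolerance

    static boolean epsilonEquals(double a, double b, double eps) {
        return Math.abs(a - b) <= eps;
    }

    static ArmPosition armPosition(boolean fwdLimit, boolean revLimit,
                                   int pot) {
        if (fwdLimit) return ArmPosition.CARRY;  // switches win outright
        if (revLimit) return ArmPosition.GRAB;
        if (epsilonEquals(pot, kPlacePot, kAcceptableError))
            return ArmPosition.PLACE;            // the only pot-based pose
        return pot > kPlacePot ? ArmPosition.BETWEEN_CARRY_AND_PLACE
                               : ArmPosition.BETWEEN_PLACE_AND_GRAB;
    }
}
```

The two "between" values give callers a goal direction for free: from BETWEEN_CARRY_AND_PLACE the arm must move down to reach PLACE, and so on.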

Riyadth Al-Kazily
Riyadth Al-Kazily9:41 AM

Also, when driving a motor forward, we should always check if we reached our target or any value above that in order to stop the motor. Otherwise, if we use the epsilonEquals() and pass through the acceptable position error, we run the risk of oscillation, since the motor will reverse and try and hit the target again.

Riyadth Al-Kazily
Riyadth Al-Kazily9:43 AM

On the topic of limit switches, I would like to see member variable flags that hold the state of the limit switches, and those flags should be used to limit motion in the direction of the limit switch. This is good practice in the case where the limit switch is triggered, but the motor doesn't stop in time, and the mechanism bypasses the limit switch mechanism. The member variable flag should only be cleared when the opposite motor direction is engaged (ie, clear the forward limit switch flag when the motor is started in reverse).
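A minimal sketch of those latched flags, with made-up names: once a limit fires, motion in that direction stays blocked even if the mechanism coasts past the physical switch, and the latch clears only when the motor is commanded in the opposite direction.

```java
// Sketch of latched limit-switch flags: the latch outlives the physical
// switch signal, so overshooting the switch doesn't re-enable motion
// toward it. Positive output = forward, by convention here.
public class LatchedLimits {
    private boolean mFwdLimitLatched;
    private boolean mRevLimitLatched;

    /** Clamp a requested motor output against the latched flags. */
    public double safeOutput(double requested,
                             boolean fwdSwitch, boolean revSwitch) {
        if (fwdSwitch) mFwdLimitLatched = true;
        if (revSwitch) mRevLimitLatched = true;
        // commanding away from a limit clears that limit's latch
        if (requested < 0) mFwdLimitLatched = false;
        if (requested > 0) mRevLimitLatched = false;
        if (requested > 0 && mFwdLimitLatched) return 0.0;
        if (requested < 0 && mRevLimitLatched) return 0.0;
        return requested;
    }
}
```

Note the middle block: if inertia carries the mechanism past the switch so fwdSwitch reads false again, the latch still blocks forward motion until a reverse command is seen.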

Dana Batali
Dana Batali9:47 AM

FYI.. Will instructed Noah that limit switches should only be used as fallback. This is what led to the soft limits situation in the code

Riyadth Al-Kazily
Riyadth Al-Kazily9:58 AM

OK, then my last paragraph on limit switch flags should be highlighted. Right now the code could conceivably get past the limit switch (maybe just due to mechanical inertia), so we will need to preserve the state of the switch when it is hit.

Binnur Alkazily
Binnur Alkazily10:00 AM

@Dana Batali if that is the case and we need to rely on soft limits, then I don't see the soft limits being set reliably in the code

Dana Batali
Dana Batali10:16 AM

our soft limits are embodied in our target positions (with the error band)... This is distinct from the soft limits that CANTalon implements. I agree with Riyadth that more bullet-proofing is required. And I completely concur with Binnur that there is a real bug in the assumption that zeroSensors at teleop init is a viable place to calibrate. As Justice indicated, the idea of going back to a rest pose after auto (both for scissor and for flipper) was discussed and rejected on the grounds that if we don't complete our mission to, for example, the scale during auto, it should be minimally easy to complete it as teleop begins.

Binnur Alkazily
Binnur Alkazily10:21 AM

May help w/ visual thinkers

Binnur Alkazily
Binnur Alkazily10:22 AM

and assumes you can read my handwriting :slightlysmilingface:

Binnur Alkazily
Binnur Alkazily10:24 AM

can we validate that autoInit for zeroSensors() is a valid calibration state? Meaning, are we sure the carry position will trigger the fwd limit switch?

Dana Batali
Dana Batali10:26 AM

i believe that knowing the robot is in carry position we might/should just assign that to the carry target value.

Binnur Alkazily
Binnur Alkazily10:29 AM

I am fine with the offset, as long as we know what we are offsetting from (which the code assumes). My concern is that as the mechanics shift the physical limit switch, the originating value changes, yet the assumed+offset does not.
this could mess up the `else if (Util.epsilonEquals(potValue, targetPosition, kAcceptablePositionError))` checks, where the targetPosition is the assumed+offset while the potValue is the actual reading

Binnur Alkazily
Binnur Alkazily10:30 AM

^^ my takeaway is to ensure we have a good starting position for calibration, so that assumed == actual

Dana Batali
Dana Batali10:31 AM

that plus making sure we only calibrate at the correct time :wink:

Binnur Alkazily
Binnur Alkazily10:31 AM

YES! :slightlysmilingface:

Dana Batali
Dana Batali10:32 AM

and perhaps if the carry position is the "calibration" position, we need two terms in our vocabulary. One where we test the limit switch positions and another where we measure our carry position.

Binnur Alkazily
Binnur Alkazily10:34 AM

potentially calibration == the soft fwd limit, reached before the physical fwd limit switch (this is the switch on ID #2)
sounds like we need to figure out what the starting positions are in comparison to the physical sensors

Binnur Alkazily
Binnur Alkazily10:36 AM

and I think that position corresponds to the mHoldPosition in code

Binnur Alkazily
Binnur Alkazily10:45 AM

any reason why we are returning the assumed values when we hit limit switches? maybe we should be updating the positions instead (but with more deterministic logic to update them)
if (mPositionMotor.get() < 0 && !mLimitSwitchRev.get())
{
    logWarning("Articulated Grabber Reverse LimitSwitch Reached");
    mPositionMotor.set(0.0);
    return mRevLimitPotentiometerValue;
}
else if (mPositionMotor.get() > 0 && !mLimitSwitchFwd.get())
{
    logWarning("Articulated Grabber Forward LimitSwitch Reached");
    mPositionMotor.set(0.0);
    return mFwdLimitPotentiometerValue;
}

Binnur Alkazily
Binnur Alkazily10:46 AM

^^ yes, I understand in theory we should not be hitting the limit switches as we have the soft limit switches in place

Binnur Alkazily
Binnur Alkazily1:57 PM

Ronan Bennett
Ronan Bennett2:02 PM

@Ronan Bennett pinned a message to this channel.

Binnur Alkazily
Binnur Alkazily8:09 PM

I wanted to give you all a virtual high-five! It was an intense weekend, and we were able to verify the subsystem functionality for flipper+grabber, scissor, and harvester. At the end of the evening, we harvested a cube, drove to the scale, and dropped it off -- tomorrow we'll be integrating all that w/ the superstructure and plumbing the system.
Get some rest and see you tomorrow!

Binnur Alkazily
Binnur Alkazily8:38 PM

@Darwin Clark you are a bit abstract for me tonight :slightlysmilingface: lets try tomorrow

Binnur Alkazily
Binnur Alkazily8:41 PM

@Ryan Olney -- I updated the handlingOpen() in harvester to 1) spin the motors as if ejecting; 2) open the arms; 3) wait for a timeout and turn off the motors. This was done as the mechanism for the harvester changed and the wheels were getting stuck. @Noah Martin will be merging the changes in tomorrow. However, we still need to test all the harvester buttons (open, close, eject) to make sure I didn't break anything in the process :smiley:

Binnur Alkazily
Binnur Alkazily9:12 PM

@Ronan Bennett nice programming list todos! ALL, pls review and make sure we are not forgetting important things! thank you :slightlysmilingface:

2018-02-19
Darwin Clark
Darwin Clark9:34 AM

@Dana Batali In the process of ordering new IP cameras, we have run into a snag. There is only one D-Link 930L in stock on Amazon. Because of that, we will need to compensate and order one more camera. Mr. Chee wants us to order a camera that is a newer model, and Charlotte suggested the D-Link 942L. I want to confirm this purchase with you, because I know little about the requirements when purchasing an IP cam. Thanks!

Dana Batali
Dana Batali9:38 AM

@Darwin Clark: as soon as we depart from a uniform ip camera across the robots, we can open up our entire arsenal. So here's one plan that doesn't require us to order any new cameras:

1. we employ the axis camera as the scissor cam for robot 1
2. we employ the two old Dlink-930 on the second robot.

That said, I'm confused... We were looking for the 933L, of which there are 5 in stock at amazon:

https://www.amazon.com/D-Link-DCS-933L-Night-Camera-Extender/dp/B00CAT0QMQ

Dana Batali
Dana Batali9:40 AM

also: if you peruse the electronic_pneumatics channel, you'll see that this is the link that i provided peter on feb 10 at 9:41am.

Dana Batali
Dana Batali9:41 AM

Darwin Clark
Darwin Clark9:41 AM

I'm just relaying what was said to me; I will forward this link to Charlotte. I was also thinking we could use the expensive Axis camera, but I was unsure about using such an expensive piece of hardware.

Dana Batali
Dana Batali9:43 AM

I'm all for using hardware that we have (no matter how expensive) over buying new stuff if it can be made to work. I merely wouldn't recommend buying any new axis cameras since they appear to be 2-3x more expensive than other models. Did you complete the axis analysis yesterday? Did you write up your conclusions anywhere?

Dana Batali
Dana Batali9:49 AM

btw: newer cameras tend to have more features we don't need (higher resolution, infra-red, sound detection), so while the 942L appears to work (VGA (low) res and wired ethernet are the requirements), i don't think its newness buys us anything. There are a number of refurbished (Amazon Prime) units here: https://www.amazon.com/gp/offer-listing/B00CAT0QMQ.

Cory_Houser
Cory_Houser9:56 AM

I’m gonna be 15 min late to the meeting today

2018-02-20
Dana Batali
Dana Batali9:45 AM

a thread for superstructure review notes

Dana Batali
Dana Batali9:46 AM

code review notes (some purely cosmetic) from superstructure:

1. no constructor? I always like to see a constructor with all vars initialized therein
2. not sure yet if I buy your use of timers between state changes, but if they are necessary, you might consider using DelayedBoolean.
3. whitespace appears a little borked
4. the need for mStateChanged seems weird to me. If you want to do something at the moment a state changes, why not do it at that point? Then the specific significance of a particular state-change activity would be clearer.
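
For note 4, a sketch of the alternative: perform the entry action inside the transition itself instead of consulting an mStateChanged flag later. State names and actions here are made up for illustration, not taken from the superstructure code:

```java
public class SuperstructureSketch {
    public enum SystemState { IDLE, OPENING_GRABBER, GRABBING }

    private SystemState mSystemState = SystemState.IDLE;
    public String lastAction = "";  // records the most recent entry action, for inspection

    // Entry actions live with the transition, so the significance of each
    // state change is visible exactly where it happens.
    private void transitionTo(SystemState newState) {
        if (newState == mSystemState)
            return;                             // no change, no entry action
        switch (newState) {
            case OPENING_GRABBER: lastAction = "openGrabber"; break;
            case GRABBING:        lastAction = "closeGrabber"; break;
            default:              lastAction = "stop"; break;
        }
        mSystemState = newState;
    }

    public void onLoop(boolean wantGrab) {
        if (wantGrab && mSystemState == SystemState.IDLE)
            transitionTo(SystemState.OPENING_GRABBER);
    }
}
```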

Dana Batali
Dana Batali9:48 AM

Seems to me (so far) that by introducing some more system states (like WAITING_FOR_GRABBER_TO_OPEN), the code might be made simpler

Dana Batali
Dana Batali10:23 AM

@Declan Freeman-Gleason: fyi - i just pushed to my branch an experimental variant of the superstructure state management. Perhaps we can go over it first thing this am?

Declan Freeman-Gleason
Declan Freeman-Gleason10:41 AM

I've read through your changes. The changes are reasonable.

Binnur Alkazily
Binnur Alkazily12:49 PM

172.22.11.2

Dana Batali
Dana Batali1:34 PM

unbricking roborio thread (support Reference#3119650)

Dana Batali
Dana Batali1:34 PM

Note: Your reference number is included in the subject field of this message. It is very important not to remove or modify this reference number, or your message may be returned to you.

Hi Dana,

This is Austin with National Instruments. You can find the recovery files you need on our FTP site:



The file is called 4915.zip. It is a double-zipped, password protected file. The password for both layers of the file is team_4915. The file you need is called recovery.cfg. Here are the steps to reconfigure your roboRIO:

1. Find a flash drive, remove everything from it, and reformat it as a FAT32 drive by right-clicking on the flash drive.
2. Download the specified file. Unzip it and place only the .cfg file on the flash drive.
3. Turn off your roboRIO.
4. Plug in the FAT32 formatted flash drive with only the CFG file on it into one of the two USB ports on the roboRIO.
5. While holding down the Reset button, turn on the roboRIO
6. Let go of the Reset button after the STATUS LED has turned on and remains solid.

As I mentioned, this won't work every time and is highly dependent on the USB drive used. If it doesn't work immediately, try any USB drives you can find. If you can't find one that works, let us know and we can schedule an RMA. If you have any trouble finding or opening this file, please let me know with a reply to this email. Thank you for contacting National Instruments!

Regards,
Austin Elledge
Applications Engineer
National Instruments
http://www.ni.com/support

Riyadth Al-Kazily
Riyadth Al-Kazily5:42 PM

I have tried and tried and tried with my two thumb drives. No luck. I've reformatted, and tried again. Still no luck...

2018-02-21
Dana Batali
Dana Batali11:22 AM

@Ronan Bennett @Noah Martin @Enrique Chee - since the remedy that NI offered appears to have failed, the next step is for one of the student leaders to call NI again and see if they'll offer us a replacement. If the replacement costs $$$, we'll need to consult with coach.

It's possible that email plus the support reference number could be used to go to the next step too. The email above was sent from

Enrique Chee
Enrique Chee12:14 PM

Thanks for working on this and keeping me updated.

Ronan Bennett
Ronan Bennett6:24 PM

It looks like they’re closed right now but I’ll call tomorrow and see if we can get a replacement

2018-02-22
Dana Batali
Dana Batali11:26 AM

yapi (yet another pid introduction): https://www.csimn.com/CSIpages/PIDforDummies.html
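
For reference, a minimal positional PID loop of the kind the article describes (the gains below are arbitrary, untuned values, not robot constants):

```java
public class SimplePid {
    private final double kP, kI, kD;
    private double mIntegral = 0.0, mPrevError = 0.0;

    public SimplePid(double kP, double kI, double kD) {
        this.kP = kP; this.kI = kI; this.kD = kD;
    }

    // Call once per control period (dt seconds); returns the motor output.
    public double update(double setpoint, double measurement, double dt) {
        double error = setpoint - measurement;
        mIntegral += error * dt;                        // I: accumulates steady-state error
        double derivative = (error - mPrevError) / dt;  // D: damps rapid error changes
        mPrevError = error;
        return kP * error + kI * mIntegral + kD * derivative;
    }
}
```

Driving a toy plant (position integrates the output) converges the measurement toward the setpoint over repeated update() calls.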

Dana Batali
Dana Batali11:54 AM

got this email today after I filled in the automated customer-support survey:


Hi Dana,


Thank you for the feedback. We're disappointed that our recovery process did not work for you. Was there any other aspect of support that could have been improved? We strive to provide all our customers with the best support possible and we want to address any areas where we can improve.

Regarding the RMA, I can help you with that. I need the following information to process the RMA:

- Name of contact to receive a new roboRIO
- Email address of contact
- Phone number for contact
- Street address to ship a new roboRIO to
- Serial number of nonfunctioning roboRIO

Ronan Bennett
Ronan Bennett2:47 PM

Unless someone knows the roboRIO serial number, it looks like we should wait until the next meeting to actually request the RMA right?

Enrique Chee
Enrique Chee4:46 PM

Yes

Riyadth Al-Kazily
Riyadth Al-Kazily6:28 PM

@Darwin Clark I have ordered a new relay that we should be able to use for the headlights. Peter was concerned because there is a semi-ambiguous rule that states that relays cannot be modified (but I did have to modify the relays to make them work). I think it will all be fine, but just in case we should prepare for a swap at the first competition. What this means for you: The new relay is going to connect to a DigitalOutput pin, and not a Relay pin. That means you will need to create a DigitalOutput object on an unused pin. I think you should use DIO pin 9 (since it is easier to plug into than the lower numbers), and I think you should keep the old code in place alongside the new code. By that I mean create both the relay and digital output objects, and when you turn the lights on, turn on both ports. It won't matter if we only have one or the other plugged in, so the code can be the same for both configurations.
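
A sketch of the dual-wiring approach described above. The real code would use WPILib's Relay and DigitalOutput classes; tiny stand-ins are used here so the sketch is self-contained, and the ports are only Riyadth's suggested DIO 9 plus a placeholder relay port:

```java
public class HeadlightSketch {
    // Stand-in for WPILib's Relay (on/off modeled as a boolean here).
    static class Relay {
        boolean on = false;
        Relay(int port) {}
        void set(boolean value) { on = value; }
    }
    // Stand-in for WPILib's DigitalOutput.
    static class DigitalOutput {
        boolean on = false;
        DigitalOutput(int channel) {}
        void set(boolean value) { on = value; }
    }

    private final Relay mHeadlightRelay = new Relay(0);               // old wiring (placeholder port)
    private final DigitalOutput mHeadlightDio = new DigitalOutput(9); // new wiring, DIO 9

    // Drive both ports every time; only one is physically connected, so the
    // same code works for either hardware configuration.
    public void setHeadlights(boolean on) {
        mHeadlightRelay.set(on);
        mHeadlightDio.set(on);
    }

    boolean relayOn() { return mHeadlightRelay.on; }
    boolean dioOn() { return mHeadlightDio.on; }
}
```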

Riyadth Al-Kazily
Riyadth Al-Kazily6:29 PM

I would also strongly recommend that you only have the lights on when needed, such as when the harvester arms are extended. That will reduce the chances of the judges complaining that the lights are too bright and always shining in someones eyes.

Riyadth Al-Kazily
Riyadth Al-Kazily6:48 PM

Also, it would be wise to turn off the headlights if the flipper arm is trying to lift up a cube. Anything we can do to reduce power consumption elsewhere on the robot would help with reliability.

Binnur Alkazily
Binnur Alkazily8:43 PM

@Declan Freeman-Gleason noted the 2nd cube autonomous paths — what is your strategy?
and related: earlier I was thinking about the strange PID experience we had during path testing the other day — it would be useful to do short test-path combos to see how well Cheesy Path handles corners and angles.

2018-02-23
Declan Freeman-Gleason
Declan Freeman-Gleason10:04 AM

@Binnur Alkazily The robot comes around from dropping a cube at the switch to grab another cube. Then it decides to either drop it on the scale or if it can't do that then drop it on the switch. I'm not even going to bother with the scale yet, because I don't think our robot is fast enough when handling cubes (driving is fine).

Binnur Alkazily
Binnur Alkazily10:05 AM

are you looking to incorporate vision to pickup the cube? or path planning?

Declan Freeman-Gleason
Declan Freeman-Gleason10:09 AM

All of that driving is path planning. I'm not going to add the complexity of vision until we see a need for it, at least not before the first competition.

Darwin Clark
Darwin Clark11:18 PM

This is where things get interesting. The issue with turning off the LEDs when not using them is that, for driver-assisted cube grabbing, we want to be able to send a bit to the driver station that says "I can snag a cube", but that can only be sent if the algorithm is running with the assistance of the LEDs. There are a few solutions that involve turning off the LEDs, but in general I would be less confident if we didn't have the LEDs on all the time; however, if that's what needs to be done, then we can work around it.

Darwin Clark
Darwin Clark11:20 PM

However, when we have a cube in the harvester, we can turn off the LED's no problem.

2018-02-25
Declan Freeman-Gleason
Declan Freeman-Gleason12:50 PM

https://www.chiefdelphi.com/media/papers/download/5212 -- Interesting paper on the linearity of FRC drivetrains, and the implications of that finding

2018-02-26
Enrique Chee
Enrique Chee6:56 PM

At our next meeting on Wed., can someone initiate the RMA process for the roboRIO? Thanks

Binnur Alkazily
Binnur Alkazily7:41 PM

@Ronan Bennett ^^ pls take this on or delegate

Ronan Bennett
Ronan Bennett8:05 PM

Yep, I’ll get on it

Declan Freeman-Gleason
Declan Freeman-Gleason10:03 PM

You don't need to show up tomorrow unless you're a programming lead, or Darwin. Mentors don't need to be there either.

Darwin Clark
Darwin Clark10:30 PM

@Declan Freeman-Gleason @Dana Batali Hey guys, I created a PR for the vision states. More info in the PR. (https://github.com/Spartronics4915/2018-POWERUP/pull/70)

Declan Freeman-Gleason
Declan Freeman-Gleason10:52 PM

I left two comments, but they're mostly superficial, and I'm going to give it a better look over later.

2018-02-27
Riyadth Al-Kazily
Riyadth Al-Kazily8:01 PM

@Declan Freeman-Gleason You might want to say "you don't need to show up on Tuesday..." in future. If it wasn't for Binnur correcting me, I might have skipped the meeting tomorrow (Wednesday) because I didn't look at when you posted your message :-) (OK, I wouldn't have skipped it, because robots!)

2018-02-28
Mark Tarlton
Mark Tarlton3:50 PM

@Declan Freeman-Gleason -- I looked at the harvester code and, as you suggested, the arms will open as soon as the cube leaves the edge of the robot. On slick floors it works fine, but on the playing field the Harvester class's "HandleEjecting" method needs to stay active as long as the cube is visible to the IR sensor. One idea is to add an "isTargetVisible" method to the SpartIRSensor class that returns true if the voltage is above 0.4V. This is cheaper than using the getDistance() method. Then add the corresponding "isCubeVisible()" method to the Harvester class for use by HandleEjecting(). This should handle the eject case without affecting the other modes.
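
A sketch of that suggested threshold check (the 0.4 V figure comes from the message above; the class shape and method of passing the voltage in are illustrative):

```java
public class IrVisibilitySketch {
    static final double kTargetPresentVoltage = 0.4; // from the suggestion above

    // In the real code this would read the analog input inside SpartIRSensor;
    // a raw voltage parameter stands in for that here. Cheaper than a full
    // getDistance() conversion because it's a single comparison.
    public static boolean isTargetVisible(double sensorVoltage) {
        return sensorVoltage > kTargetPresentVoltage;
    }

    // Harvester-side wrapper for use by HandleEjecting(): keep ejecting while
    // the cube is still in front of the IR sensor.
    public static boolean isCubeVisible(double sensorVoltage) {
        return isTargetVisible(sensorVoltage);
    }
}
```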

Declan Freeman-Gleason
Declan Freeman-Gleason4:01 PM

@Mark Tarlton I added a timer because I didn't have the time to spend tuning IR sensor ranges/voltages. It seemed to work pretty well.

Austin Smith
Austin Smith4:26 PM

I’ll be there around 5:30

Austin Smith
Austin Smith4:26 PM

Maybe 5:45.

Justice James
Justice James4:26 PM

FYI programming is done at 7:30

Austin Smith
Austin Smith4:27 PM

Thanks for letting me know

Binnur Alkazily
Binnur Alkazily5:46 PM

On the 5:45 - will be there ~ 6:30

Binnur Alkazily
Binnur Alkazily8:22 PM

@Dana Batali @Michelle Dalton ^^^ are we missing anything that we should have packed?

Dana Batali
Dana Batali8:47 PM

HDMI cable
Extra power supplies for cameras and raspi
Extra raspi microsd cards
Backup cameras, hubs?

Binnur Alkazily
Binnur Alkazily8:58 PM

@Ronan Bennett for anything we don't have in house, ping - there is a good chance mentors have them at home.

Binnur Alkazily
Binnur Alkazily8:59 PM

Dana - as I understand we are not going to be able to do vision for the 1st competition.

Ronan Bennett
Ronan Bennett9:24 PM

Ok I didn’t have time to check today but if we’re missing something tomorrow I’ll say

2018-03-01
Ethan Rininger
Ethan Rininger5:27 PM

We should go for the new "Autonomous Award" in our second competition!

2018-03-02
Chris Rininger
Chris Rininger10:23 AM

Saw scissor lift in match today start with cube and lift already elevated to switch height for quicker dropoff during auto. Seemed worth mentioning as an idea to steal

Enrique Chee
Enrique Chee10:33 AM

Good pt !

Chris Rininger
Chris Rininger11:18 AM

2262 at Worcester Polytechnic, a week-one scissor lift bot

Dana Batali
Dana Batali11:27 AM

fyi - this was certainly something we talked about and experimented with. I'm uncertain where the code is at the moment.

Dana Batali
Dana Batali11:28 AM

(actually, what we were thinking is that we elevate during driving, i think).

Declan Freeman-Gleason
Declan Freeman-Gleason11:29 AM

We can place at the switch without moving the scissor lift. That's what we currently do in auto.

Chris Rininger
Chris Rininger11:33 AM

Great did not realize that thx

2018-03-03
Dana Batali
Dana Batali8:03 AM

Where are we on a lower scale scissor position? I raise this question because I saw a surprising number of cube bounce-outs in yesterday's Clackamas matches

Declan Freeman-Gleason
Declan Freeman-Gleason8:05 AM

I need to get a potentiometer reading, but I'm going to set a lower position (probably right above the middle scale height.)

Dana Batali
Dana Batali8:12 AM

probably don't need to remind you: be very careful with new code deployments. As a rule: no new code on a competition robot unless it's been tested in the pit or on the playing field.

Binnur Alkazily
Binnur Alkazily8:36 AM

we need to provide programming assistance

Binnur Alkazily
Binnur Alkazily10:53 AM

@Dana Batali we had practice matches where the grabber would not open. We think it may be a state transition issue from auto to teleop, as the grabber requires the upstream solenoid (grabberSetup) to always be on. We made a code change to ensure onLoop always calls set(true) for the grabberSetup (vs. checking to see if setup is required). It is a strange issue, as when we were testing off the field in practice mode the system functioned as expected. FYI

Binnur Alkazily
Binnur Alkazily10:55 AM

@Declan Freeman-Gleason what autonomous paths were used during the 3 practice matches

Dana Batali
Dana Batali10:56 AM

I will take a peek and see if anything sticks out

Declan Freeman-Gleason
Declan Freeman-Gleason11:00 AM

Practice match 1: C: Drive to Switch, Practice match 2: B: Drive to Switch, B: Drive to Switch

Binnur Alkazily
Binnur Alkazily11:02 AM

^^ weirdly enough, practice match 1 was good - the system worked in auto and teleop.
Code was pushed to update auto paths after match 1. However, neither auto nor teleop opened the grabber in match 2 or 3.

Dana Batali
Dana Batali11:39 AM

@Binnur Alkazily: so far here's some notes:

my reading is that it's hard to imagine mSystemState.grabberSetup being the cause. I'm more inclined to think that the logic in handleGrabber (based on mNextState) is the issue

as far as I can tell, there is no reason that stop() should execute between auto and tele. A careful look at the logs might be required to truly understand this issue. As far as I can tell, this is the only place where mGrabberSetup is set to false. If this is being called and the code is overriding system state, then mNextState might need to be set to the same as mSystemState.

if we can avoid repeatedly doing the same thing in onLoop, we should

there is some confusion between the use of mNextState and mSystemState. I would argue that mNextState should be a private-to-looper variable and mSystemState should be the thing that reflects the current system state. In particular, handleGrabberState shouldn't probably look at the mNextState, but rather mSystemState.

Dana Batali
Dana Batali11:42 AM

* one other thought (this one sounds plausible): The RELEASE_CUBE wanted state will only work if we're near the place position. If we're not at the place position, I guess the grabber may not release?

Binnur Alkazily
Binnur Alkazily11:45 AM

The last item, RELEASE_CUBE not being near the place position, is my next area of suspicion.

Binnur Alkazily
Binnur Alkazily11:45 AM

We seem to be having a connection issue — on the competition field right now

Dana Batali
Dana Batali11:46 AM

sigh...

Binnur Alkazily
Binnur Alkazily11:46 AM

Wondering if issue relates to the potentiometer value somehow —

Binnur Alkazily
Binnur Alkazily11:46 AM

Declan says we use the LazySolenoid call, so if it is already true it should quickly pass through

Dana Batali
Dana Batali11:47 AM

seems to me we need a release button that may not be dependent upon the flipper position?

Binnur Alkazily
Binnur Alkazily11:47 AM

I agree on that! I suggest we add that regardless

Binnur Alkazily
Binnur Alkazily11:48 AM

I think we may be good??? About to start??

Dana Batali
Dana Batali12:01 PM

the get() call goes all the way through the hardware abstraction layer... Not sure if it's querying the CAN bus, but still best to avoid unless needed.

Binnur Alkazily
Binnur Alkazily12:20 PM

Auto log from the case where the grabber did not open

Binnur Alkazily
Binnur Alkazily12:21 PM

When auto worked and grabber opened to drop, logs show RELEASE_CUBE

Binnur Alkazily
Binnur Alkazily12:23 PM

The autonomous logs where grabber released cube

Dana Batali
Dana Batali12:27 PM

first log shows that we go from auto to disabled to periodic... Is this from a test match or a real one?

Binnur Alkazily
Binnur Alkazily12:27 PM

Both of the logs were from actual practice match on the field

Binnur Alkazily
Binnur Alkazily12:28 PM

Noticing that when we didn't release the cube, the last state printed is PREPARE_DROP

Dana Batali
Dana Batali12:28 PM

would love to see the log from the match... Why no auto?

Binnur Alkazily
Binnur Alkazily12:30 PM

Logs show nothing was selected. We think the refs reset the auto selection? As they changed us back to static IP (I guess during practice, to solve connection issues, they put us on mDNS)

Binnur Alkazily
Binnur Alkazily12:30 PM

Declan setting default to move forward

Dana Batali
Dana Batali12:30 PM

for the real match, does it show that we enter disabled between auto and tele?

Dana Batali
Dana Batali12:32 PM

if so, then disabledInit calls zeroSensors which would be bad in the middle of a match

Binnur Alkazily
Binnur Alkazily12:33 PM

Yes

Declan Freeman-Gleason
Declan Freeman-Gleason12:34 PM

I'll remove zeroSensors there...

Binnur Alkazily
Binnur Alkazily12:34 PM

Last match w/ no auto running

Binnur Alkazily
Binnur Alkazily12:35 PM

Dana Batali
Dana Batali12:38 PM

importantly, this implies that we stop, then restart the looper, so the subsystems' stop() method would indeed be called

Binnur Alkazily
Binnur Alkazily12:38 PM


Which explains the grabber not opening

Dana Batali
Dana Batali12:39 PM

not necessarily: if the stop method is called, then the start method would be called which samples the current solenoid state

<edit> each subsystem's onStart and onStop...

autoInit
    onStart()
disabledInit
    zeroSensors()
    onStop()
teleInit
    onStart()

Binnur Alkazily
Binnur Alkazily12:40 PM

Looks like disabledInit is always getting called between auto to teleop

Binnur Alkazily
Binnur Alkazily12:41 PM

I go back to my question on why PREPARE_DROP is the state when the grabber did not open (instead of RELEASE_CUBE when the grabber opened as expected)

Binnur Alkazily
Binnur Alkazily12:42 PM

Do we have a log statement for onStart?

Dana Batali
Dana Batali12:42 PM

my current hypothesis is that the zeroSensors call whacked out the potentiometer values (?) and now the RELEASE_CUBE check measures itself in the wrong zone

Binnur Alkazily
Binnur Alkazily12:43 PM

I buy that — Declan added an 'override open' button, which should work around the potentiometer issue

Dana Batali
Dana Batali12:44 PM

re onStart logs: we would need one for each subsystem's loop. Probably clearer to only have one for the Looper.

Binnur Alkazily
Binnur Alkazily1:20 PM

I am looking at the code - looks like autonomous always requests the RELEASE_CUBE wanted state. Given that, why is the logger printing out PREPARE_DROP? What am I missing in the code?

Declan Freeman-Gleason
Declan Freeman-Gleason1:21 PM

We tell the robot to prepare drop as it's driving to the switch

Dana Batali
Dana Batali1:28 PM

nice match!

Dana Batali
Dana Batali3:51 PM

2910 has a great scale robot!

Dana Batali
Dana Batali3:52 PM

and a great climber! (jack in the bot)

Dana Batali
Dana Batali7:46 PM

team! right now (end of saturday) we're third in OPR!!!

Paul Vibrans
Paul Vibrans8:04 PM

Where do I find OPR?

Binnur Alkazily
Binnur Alkazily8:08 PM

Wow!! Let’s hope for a good run tomorrow

Dana Batali
Dana Batali8:18 PM

right now we're 5th in opr:

https://www.thebluealliance.com/event/2018wamou

Binnur Alkazily
Binnur Alkazily9:25 PM

@Dana Batali thanks for the remote support today - much appreciated!!

2018-03-04
Randy Groves
Randy Groves3:16 PM

@Jon Coonan Attack their cube!

2018-03-05
Mike Rosen
Mike Rosen12:14 PM

I’m traveling this week and will be unable to attend our meetings.

Great day yesterday. Congrats!!

2018-03-06
Chris Rininger
Chris Rininger8:54 AM

I'm still interested in a high camera, and since I've heard we don't need the climber-targeting camera, it seems like we have a camera slot to spend if we can come up with a way to get the camera up there. Here's one idea: use a telescoping pole plus an ultra-small, lightweight IP camera. Either use the lift to raise it or find some other way. Here are examples of a telescoping pole and a tiny IP camera. Thoughts? Other ideas?
https://www.amazon.com/gp/product/B00BIEBEPM/ref=oxscsfltitle1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER
https://www.amazon.com/Phylink-PLC-128PW-Wireless-Pinhole-Detection/dp/B00N8DOAWA/ref=sr14?s=electronics&ie=UTF8&qid=1520354962&sr=1-4&keywords=tiny+ip+camera&dpID=41KOvqr1sfL&preST=SX300QL70&dpSrc=srch