james (james@slattery.tech)
2016-05-06 17:34:37

@james has joined the channel

james (james@slattery.tech)
2016-06-14 12:46:08

@james set the channel purpose: Code the new website spartronics.onion :D

james (james@slattery.tech)
2016-06-14 12:46:47

added an integration to this channel: github

james (james@slattery.tech)
2016-06-14 12:48:09

^ Spartronics Github account still needs to be added. Don't know who has those credentials

Clio Batali (cliombatali@gmail.com)
2016-06-14 22:10:14

@Clio Batali has joined the channel

alex_lf (al3xlf@gmail.com)
2016-06-22 19:34:31

@alex_lf has joined the channel

joncoonan (jonathancoonan@gmail.com)
2016-07-18 14:45:25

@joncoonan has joined the channel

coachchee (echee@bisd303.org)
2016-09-13 22:05:08

@coachchee has joined the channel

james (james@slattery.tech)
2016-09-14 10:50:08

@james set the channel topic: Things about programming

dana_batali (dana.batali@gmail.com)
2016-09-17 11:24:04

@dana_batali has joined the channel

binnur (binnur.alkazily@gmail.com)
2016-09-17 12:44:50

@binnur has joined the channel

michelle_dalton (mdalton@olypm.com)
2016-09-17 17:26:04

@michelle_dalton has joined the channel

james (james@slattery.tech)
2016-09-17 21:23:06

@james set the channel purpose: make robot move

riyadth (riyadth@gmail.com)
2016-09-18 11:41:06

@riyadth has joined the channel

dana_batali (dana.batali@gmail.com)
2016-09-18 17:37:44

for new programmers: http://psgraphics.blogspot.com/2016/09/a-new-programmers-attitude-should-be.html

chrisrin (chrisrin@microsoft.com)
2016-09-19 08:39:26

@chrisrin has joined the channel

chrisrin (chrisrin@microsoft.com)
2016-09-19 08:44:46

Hey, I took the first few weeks of a class on robotics controls / automation this summer before I got too busy to keep up with it. It's a free course on Coursera & they start a new session every 2 weeks. I believe you can sign up for the course and then have access to all the course materials, including many, many videos. I have a feeling the teams that excel at automation are deep into the concepts the course teaches, and I was thinking there might be a way for a group of Spartronics mentors & students to at least survey the course. Here's a link: https://www.coursera.org/learn/mobile-robot --> check it out!

Clio Batali (cliombatali@gmail.com)
2016-09-19 09:43:06

Looks cool, thanks!

timo_lahtinen (timolahtinen1@gmail.com)
2016-09-21 16:22:25

@timo_lahtinen has joined the channel

chrisrin (chrisrin@microsoft.com)
2016-09-23 18:26:43

I joined a FIRST mentor discussion group at Microsoft, and there's a coffee chat coming up about how to approach automation and how a team can learn & improve. A few people with experience are going to share insights. I'm planning to attend, and if there are any questions / topics anyone would like raised, please let me know.

Clio Batali (cliombatali@gmail.com)
2016-09-24 08:12:50

If anything about the new new radio comes up, any insight would be great! Also thoughts on how to handle the preseason and a larger programming team (this year is looking like we'll have to train about 15 freshmen). Thanks!

Clio Batali (cliombatali@gmail.com)
2016-09-24 08:14:08

(@chrisrin: )

Clio Batali (cliombatali@gmail.com)
2016-09-24 21:09:38

Okay, update on ARES: the motor controllers were fine after resetting them (12 was acting up today, but 10 was fine), but the launcher module wasn't working. We loaded the most updated code on github to the robot to make sure our versioning was correct, and that solved the problem. We stress tested for a solid hour, and all the relevant autonomous modes work fine (except the low bar auto - the portcullis arm drops, the launcher repositions to neutral, but the robot doesn't drive at all). A chain fell off (all fixed), and the light positioning needs to be re-calibrated. In all, the robot is functional now! Wiring is checked (we'll need to keep an eye on a few things before competition) and that auto needs to be figured out, but ARES is ready to use for our girls' gen meetings.

coachchee (echee@bisd303.org)
2016-09-24 21:11:31

Thanks for the recap !!

jack (jack@phroa.net)
2016-09-24 23:16:23

@jack has joined the channel

riyadth (riyadth@gmail.com)
2016-09-25 20:52:17

I found a couple of nice videos describing PID control, in case anyone wants to find out more about what it really is and how it works: https://www.youtube.com/watch?v=UR0hOmjaHp0 https://www.youtube.com/watch?v=XfAt6hNV8XM

👍 chrisrin
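
For anyone who would rather see the idea in code than in video, here is a minimal PID loop sketch in Java. It is illustrative only (not the team's code; WPILib also ships a PIDController class that does this for you), and the class name, gains, and update period are all made up.

    // Minimal PID sketch: computes a motor output from an error signal.
    public class SimplePid {
        private final double kP, kI, kD;   // hypothetical tuning gains
        private double integral = 0.0;
        private double lastError = 0.0;

        public SimplePid(double kP, double kI, double kD) {
            this.kP = kP;
            this.kI = kI;
            this.kD = kD;
        }

        // Call once per control period (dt in seconds); returns the motor output.
        public double calculate(double setpoint, double measurement, double dt) {
            double error = setpoint - measurement;        // proportional term input
            integral += error * dt;                       // accumulated error (integral term)
            double derivative = (error - lastError) / dt; // rate of change of error (derivative term)
            lastError = error;
            return kP * error + kI * integral + kD * derivative;
        }
    }
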
jack (jack@phroa.net)
2016-09-27 17:05:23

sign up and get a free shirt for making 4 pull requests (whether or not they're accepted) to any github project in october https://www.digitalocean.com/company/blog/ready-set-hacktoberfest/

chrisrin (chrisrin@microsoft.com)
2016-09-28 07:42:14

Riyadth, those videos are great! Thanks for sharing.

lia_johansen (lilixlucky@gmail.com)
2016-10-01 22:48:39

@lia_johansen has joined the channel

jeff_dalton (chezdalton@gmail.com)
2016-10-05 15:23:24

@jeff_dalton has joined the channel

dana_batali (dana.batali@gmail.com)
2016-10-06 15:25:21

a nice summary of motors and motor controllers: https://www.youtube.com/watch?v=5thxBgew7N0&feature=youtu.be

Clio Batali (cliombatali@gmail.com)
2016-10-06 15:28:00

Thanks for sharing!

timo_lahtinen (timolahtinen1@gmail.com)
2016-10-06 19:56:35

For next team meeting with programming: Lia and I think that teaching the new team members about git / creating an account, forking, installing eclipse and plugins, and possibly pushing would make for a good first subteam meeting. Mentors - do you have any suggestions? Jack - would you be willing to help teach git?

riyadth (riyadth@gmail.com)
2016-10-06 20:27:33

I think that's a good plan, but it might be a lot for the new people to get done in one session. Also, it doesn't get their fingers in the code. Maybe if you skip forking/branching/pushing on the first round, but instead:

  1. set up the development environment
  2. create a github account and clone the Stronghold repo
  3. compile the code and download it to the robot (introduces networking concepts)
  4. if they have some idea about Java already, maybe change something in the code and see the change after they download (we might have to come up with a suggested "assignment")
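
As a concrete example of the kind of first "assignment" change step 4 imagines, here is a purely illustrative sketch assuming the standard WPILib IterativeRobot template; the class name, counter, and printed message are all made up.

    import edu.wpi.first.wpilibj.IterativeRobot;

    // A trivial first change a new programmer could make and then verify on the
    // robot: print a message to the driver station console during teleop.
    public class Robot extends IterativeRobot {
        private int loopCount = 0;

        @Override
        public void teleopPeriodic() {
            loopCount++;
            if (loopCount % 50 == 0) {  // teleopPeriodic runs roughly 50 times per second
                System.out.println("Hello from the programming team! loops=" + loopCount);
            }
        }
    }
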
riyadth (riyadth@gmail.com)
2016-10-06 20:29:58

That way we could save the Git tutorial for the next time, when they already have a copy of the repository to refer to. You can explain how the programming subteams all work together on the main code by each working in their own repos. New people really need to understand that workflow well, so it's best to teach it to them without also having to go through the hassle of getting all the tools working.

jack (jack@phroa.net)
2016-10-06 20:38:17

There must be a way to teach git that a) doesn't confuse people too badly, b) doesn't require homework/reading at home (nobody ever does)

jack (jack@phroa.net)
2016-10-06 20:38:39

I can't think of it though...

jack (jack@phroa.net)
2016-10-06 20:39:36

(to actually answer your question, I wouldn't mind helping teach git but I told Chee I wouldn't be a leader... I guess that makes me a student mentor then)

coachchee (echee@bisd303.org)
2016-10-06 20:43:01

Just because you are not in leadership does not mean you can't help. Go for it. We need your help. We are a team.

james (james@slattery.tech)
2016-10-06 20:43:49

@jack Codecademy came out with a git thing I think

lia_johansen (lilixlucky@gmail.com)
2016-10-06 20:54:21

Riyadth, good idea. Thanks. We will have them work with github and the code to become familiar. We leaders can explain how we work and communicate throughout the subteams.

dana_batali (dana.batali@gmail.com)
2016-10-07 09:31:55

i, too, would recommend delaying discussions of git in favor of focus on robot code, eclipse setup, and even “starting from scratch” with robot code (from WPI tutorials)...

dana_batali (dana.batali@gmail.com)
2016-10-10 10:30:17

one of the central device-types we need to program is the servo motor: Here is some helpful background on design and control aspects:

http://www.robotplatform.com/knowledge/servo/servo_tutorial.html

dana_batali (dana.batali@gmail.com)
2016-10-10 10:30:19

http://www.robotplatform.com/knowledge/servo/servo_control_tutorial.html

binnur (binnur.alkazily@gmail.com)
2016-10-12 20:47:28

@lia_johansen you already have the ‘manager’ rights on the google programming group if you want to set that up in the meantime - fyi

dana_batali (dana.batali@gmail.com)
2016-10-13 15:32:24

hm … these changes are likely to cause us a little more pain this year than last. Not a giant amount, but best for electronics and programmers to read the announcement a couple times.

dana_batali (dana.batali@gmail.com)
2016-10-15 13:51:26

Here are some slides from a recent nvidia webinar on "deep learning". This is a common approach to machine vision and is "all the rage" right now: http://on-demand.gputechconf.com/gtc/2016/webinar/embedded-deep-learning-nvidia-jetson.pdf

coachchee (echee@bisd303.org)
2016-10-15 22:27:40

A file, which can't be shown because your team is past the free storage limit, was commented on.

coachchee (echee@bisd303.org)
2016-10-15 22:32:24

A file, which can't be shown because your team is past the free storage limit, was commented on.

coachchee (echee@bisd303.org)
2016-10-15 22:53:50

@coachchee pinned their File to this channel.

coachchee (echee@bisd303.org)
2016-10-15 22:53:59

@coachchee pinned their File to this channel.

coachchee (echee@bisd303.org)
2016-10-15 23:42:02

Here is a link for Java programming in FIRST. Must read for all programmers. https://wpilib.screenstepslive.com/s/4485/m/13809

coachchee (echee@bisd303.org)
2016-10-15 23:42:11

@coachchee pinned a message to this channel.

} Enrique Chee (https://spartronics.slack.com/team/U1AQU2B97)
B1GTJAH98
2016-10-16 10:38:11

[Spartronics4915/2016-Stronghold] Pull request submitted by mdalton-spartronics (https://github.com/mdalton-spartronics)

B1GTJAH98
2016-10-16 10:43:36

[2016-Stronghold:master] (https://github.com/Spartronics4915/2016-Stronghold/tree/master) 2 new commits by mdalton-spartronics and 1 other (https://github.com/Spartronics4915/2016-Stronghold/compare/11866539e1a5...621ce18e011d)

B1GTJAH98
2016-10-16 10:43:41

[Spartronics4915/2016-Stronghold] New comment by binnur on pull request #174: Autonomous fixes for Girls Gen (https://github.com/Spartronics4915/2016-Stronghold/pull/174#issuecomment-254061608)

binnur (binnur.alkazily@gmail.com)
2016-10-16 10:47:55

@lia_johansen please pull the latest code for girls gen.

We fixed the Portcullis.java code that had mismatched ‘ { ’ issues. Moving forward, we will be dictating a coding style guide to minimize this problem.

As a note, the autonomous selection was still wonky. Looking at this year’s code, the logic is more complex than it needs to be - this is an area we will be simplifying as well.

binnur (binnur.alkazily@gmail.com)
2016-10-16 10:50:35

… whoever integrated the github bot into this channel: I like it, but I am concerned about the amount of noise it will generate when we deep-dive into the programming season. That said, I am curious enough to see how well it will work. If the channel becomes noisy, we can address that later on.

jack (jack@phroa.net)
2016-10-16 15:37:25

@binnur I can move it to a new channel (#commits or something) if it's an issue

binnur (binnur.alkazily@gmail.com)
2016-10-16 15:38:24

figured 🙂 we can hold off and see how it works when we launch into programming activities. right now, I like having everything in one place!

jack (jack@phroa.net)
2016-10-17 22:22:17

2016 Seattle GNU/Linux conference is on November 11-12 at Seattle Central College, admission and food is free https://seagl.org

🐧 riyadth, jeff_dalton, tom_wiggin
rose_bandrowski (rose.bandrowski@gmail.com)
2016-10-19 15:29:53

@rose_bandrowski has joined the channel

jack (jack@phroa.net)
2016-10-23 22:31:00

@binnur to be honest, I only made those PRs (and not direct commits) because I wanted a free shirt

binnur (binnur.alkazily@gmail.com)
2016-10-24 06:47:51

:) we should talk about priorities ;-) hopefully it worked :)

Kate Treviño-Yoson (tenorsax@comcast.net)
2016-10-26 12:02:24

@Kate Treviño-Yoson has joined the channel

olivia_pells (oliviapells@gmail.com)
2016-10-26 18:43:32

@olivia_pells has joined the channel

jack (jack@phroa.net)
2016-10-26 19:39:27
jack (jack@phroa.net)
2016-10-26 19:39:39

@jack pinned their File to this channel.

declan_freemangleason (declanfreemangleason@gmail.com)
2016-10-26 20:16:45

@declan_freemangleason has joined the channel

jeremy_lipschutz (jdlgobears@gmail.com)
2016-10-26 20:40:24

@jeremy_lipschutz has joined the channel

adrien_chaussabel (adrien.chaussabel@gmail.com)
2016-10-26 20:52:28

@adrien_chaussabel has joined the channel

binnur (binnur.alkazily@gmail.com)
2016-10-26 21:06:14

@jack: thanks!

jack (jack@phroa.net)
2016-10-26 21:18:13

@binnur I'll write up some stuff for the handbook

binnur (binnur.alkazily@gmail.com)
2016-10-26 21:18:53

Love it! And looking forward to reading it :)

coachchee (echee@bisd303.org)
2016-10-26 21:38:57

@jack: thanks

ronan_bennett (benneron000@frogrock.org)
2016-10-26 22:44:54

@ronan_bennett has joined the channel

benjamin_soldow (soldow.ben@gmail.com)
2016-10-27 00:46:00

@benjamin_soldow has joined the channel

michael_nelson (michael.nelson1496@gmail.com)
2016-10-27 18:57:35

@michael_nelson has joined the channel

kaedric_holt (kaedholt@gmail.com)
2016-10-28 06:54:48

@kaedric_holt has joined the channel

Clio Batali (cliombatali@gmail.com)
2016-10-29 11:18:00

Fyi - the programming stuff in the robotics room has been moved closer to the door (on the right side of the safety glasses walking in - everything's labeled in drawers/cabinets)

Harper Nalley (nalleluc000@frogrock.org)
2016-10-29 22:12:39

@Harper Nalley has joined the channel

binnur (binnur.alkazily@gmail.com)
2016-10-30 09:47:49

what is the programming stuff? as a note, we should have a section in a filing cabinet for the licenses

tom_wiggin (twigginthecool@gmail.com)
2016-10-30 13:20:59

@tom_wiggin has joined the channel

Clio Batali (cliombatali@gmail.com)
2016-10-30 16:17:01

Backup computers, keyboard/mouse, jetson, lights, cameras, kinect, etc.

jack (jack@phroa.net)
2016-10-31 15:11:08

@tom_wiggin are you MotGit?

lia_johansen (lilixlucky@gmail.com)
2016-10-31 20:55:36

Yeah he is @jack

tom_wiggin (twigginthecool@gmail.com)
2016-11-01 13:18:28

I found a piece of open source software called Electric in the Ubuntu repositories that is meant for making circuit diagrams. I heard that we needed a circuit-diagram tool this year, so I thought I would help 🙂

tom_wiggin (twigginthecool@gmail.com)
2016-11-01 13:19:49

here's the website

tom_wiggin (twigginthecool@gmail.com)
2016-11-01 13:19:51
binnur (binnur.alkazily@gmail.com)
2016-11-01 22:05:06

Awesome! Thanks :)

jack (jack@phroa.net)
2016-11-01 22:26:23

more coming eventually...

tom_wiggin (twigginthecool@gmail.com)
2016-11-04 20:34:22

Why do I need a style guide?

declan_freemangleason (declanfreemangleason@gmail.com)
2016-11-04 22:05:27

So other people can read your code.

👍 Clio Batali, binnur
👌 alex_lf
jack (jack@phroa.net)
2016-11-04 23:03:55

merge conflicts

tom_wiggin (twigginthecool@gmail.com)
2016-11-05 00:42:08

Why are voice calls a paid feature?

tom_wiggin (twigginthecool@gmail.com)
2016-11-05 00:43:02

I'm sure I could set up a Mumble server for our team if we need voice communications

tom_wiggin (twigginthecool@gmail.com)
2016-11-05 00:43:52

That could be a project we could work on before kickoff!

tom_wiggin (twigginthecool@gmail.com)
2016-11-05 00:50:36

and we could use Filezilla for unlimited file transfers

tom_wiggin (twigginthecool@gmail.com)
2016-11-05 00:51:24

Do we have any server machines?

:spartronics: tom_wiggin
:godmode: tom_wiggin
jack (jack@phroa.net)
2016-11-05 06:25:46

we have each other's cell phones so normally we just call people

😕 tom_wiggin
👍 declan_freemangleason
tom_wiggin (twigginthecool@gmail.com)
2016-11-05 17:30:30

I'm waiting for my raging acne outbreak to disappear first

😡 tom_wiggin, declan_freemangleason
mrosen (michael.rosen@gmail.com)
2016-11-06 09:51:56

@mrosen has joined the channel

mrosen (michael.rosen@gmail.com)
2016-11-06 09:58:21

Has anyone tried getting the simulator "FRCSim" working? I'm following the instructions on https://wpilib.screenstepslive.com/s/4485/m/23353/l/228979-installing-frcsim-manually and having limited success: Gazebo comes up and renders the .world, and the robot code is clearly trying to talk to the simulator, but the robot in Gazebo isn't doing anything. I think I've got the attention of the WPI guy who did the Youtube videos, so I'm optimistic about making this work. Is there interest in this from Programming team members?

Charlotte (charlottelf@protonmail.com)
2016-11-06 10:21:21

@Charlotte has joined the channel

jack_chapman (jwc10101@gmail.com)
2016-11-06 10:24:21

@jack_chapman has joined the channel

Kenneth Wiersema (kcw815@icloud.com)
2016-11-06 10:26:28

@Kenneth Wiersema has joined the channel

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 12:54:29

We have a simulator for that?

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 12:56:44

Why wasn't I told about this?!

💢 tom_wiggin
chris_mentzer (cmentzer@mentzer.org)
2016-11-06 13:03:48

@chris_mentzer has joined the channel

jack (jack@phroa.net)
2016-11-06 13:06:38

@tom_wiggin we've never used it before; it was just brought up by someone (don't remember who, sorry) at the SkunkWorks workshops yesterday from his own research :)

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 13:07:36

k

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 13:08:13

I get notifications on my laptop whenever someone posts anything on slack

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 13:08:19

its quite neat

jack (jack@phroa.net)
2016-11-06 13:09:11

The guy who developed it was helping present yesterday, apparently it has gone from "basically unusable" to "it's very cool if you can flick the right levers" this year

jack (jack@phroa.net)
2016-11-06 13:09:24

would be great if you could set it up

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 13:09:55

What went from unusable to somewhat usable after a sacrifice to the software gods?

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 13:10:01

was it the simulator?

jack (jack@phroa.net)
2016-11-06 13:59:08

The simulator, yeah.

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 14:32:54

ok...

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 14:33:07

I had a great idea!

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 14:33:24

we should all read the daily wtf

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 14:33:39

then maybe things like this wouldn't happen

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 15:11:45

If you read it you will know what I mean

jack (jack@phroa.net)
2016-11-06 15:20:05

okay well if you're interested in frcsim you should try to set it up, we could probably use someone who knows how to use it on the team this year

jeff_dalton (chezdalton@gmail.com)
2016-11-06 15:25:05

On Saturday, I was chatting with the Skunkworks mentor who wrote (part of team?) FRCSim when he was at WPI 2 years ago. He said performance has gotten better in 2 years, but he doesn't know of anyone that uses it to test software. It is mostly used just for demos. On the positive side, it is only loosely tired to Solidworks. Just an export tool. So there is nothing architecturally preventing its use with Fusion 360. He had no enthusiasm for using FRCSim productively as a tool. To me, it sounded like this was a school project for him that was more a demonstration of technology than a tool. That said, I've been mucking with it too, but it sounds like you're further along. I've got Gazebo running, but need to track down some libs for FRCSim.

jack (jack@phroa.net)
2016-11-06 15:25:17

most things come from Worcester Polytechnic Institute, see http://wpilib.screenstepslive.com/s/4485 for documentation on FIRST programs

jack (jack@phroa.net)
2016-11-06 15:26:18

it takes a lot of mental effort to read that

jack (jack@phroa.net)
2016-11-06 15:27:54

I don't know about direct Microsoft ties, but many teams and events are sponsored by Microsoft, and many volunteers are reimbursed by Microsoft.

jack (jack@phroa.net)
2016-11-06 15:28:29

I assume someone loosely related to Microsoft worked on FRCSim at some point, why?

tom_wiggin (twigginthecool@gmail.com)
2016-11-06 15:43:20

"only loosely tired to solidworks" you can edit messages btw

😬 tom_wiggin
tom_wiggin (twigginthecool@gmail.com)
2016-11-06 15:52:12

What is gazebo and WPI?

😕 tom_wiggin
declan_freemangleason (declanfreemangleason@gmail.com)
2016-11-06 16:20:56

WPI: https://wpi.edu Gazebo: http://gazebosim.org

jack (jack@phroa.net)
2016-11-06 16:40:20
jack (jack@phroa.net)
2016-11-06 16:40:25

@jack pinned a message to this channel.

} Jack Stratton (https://spartronics.slack.com/team/U2FJQQJ95)
jack (jack@phroa.net)
2016-11-06 16:40:34

(ignore, just pinning for later)

jack (jack@phroa.net)
2016-11-06 16:47:37

we use it a lot, so get used to it :)

jack (jack@phroa.net)
2016-11-06 16:47:44

actually I should probably pin that too

jack (jack@phroa.net)
2016-11-06 16:47:58
tom_wiggin (twigginthecool@gmail.com)
2016-11-06 16:47:59

don't its obvious

😅 tom_wiggin
tom_wiggin (twigginthecool@gmail.com)
2016-11-06 16:48:04

too late...

💀 tom_wiggin
finn_mander (finn.mander@gmail.com)
2016-11-06 19:11:51

@finn_mander has joined the channel

mrosen (michael.rosen@gmail.com)
2016-11-07 20:53:29

I've spent perhaps four hours over the last two days on "frcsim." Here's what I've found.

There is a simulator for FRC robots. That means you can run and debug code entirely on your laptop hardware, without a robot connected. You can even hook up an Xbox controller and drive a robot around the screen.

I think it can work well enough to be useful. Here's a 4 minute youtube video of Peter Mitrano (WPI) characterizing a video-game controller and then using it to drive a simple robot: https://www.youtube.com/watch?v=SDk0TW8Xgic&t=126s. I can reproduce this demonstration on my Linux laptop.

That said, it is absolutely "not ready for primetime." It's cutting edge. Expect to bleed when you touch it. Lots of installation woes and unexpected, unexplained behavior.

But again, ... it can be done ... and if you do it, you can be writing test / exploratory programs this week, not when hardware is available (when?). This is my first year with Spartronics but I have to think that a ready test environment would be hugely valuable here. Am I right?

Does anyone want to chase this with me?

👍 joncoonan, declan_freemangleason
💯 joncoonan, declan_freemangleason
tom_wiggin (twigginthecool@gmail.com)
2016-11-07 20:56:03

I have 7 weeks of math homework to catch up on

tom_wiggin (twigginthecool@gmail.com)
2016-11-07 20:56:41

I will help you!

😜 tom_wiggin
whobbs1496 (whobbs1496@gmail.com)
2016-11-07 21:11:36

@whobbs1496 has joined the channel

mrosen (michael.rosen@gmail.com)
2016-11-07 23:08:58

@mrosen has left the channel

mrosen (michael.rosen@gmail.com)
2016-11-07 23:10:31

@mrosen has joined the channel

connor_weiss (conrad.t.weiss@gmail.com)
2016-11-09 18:38:34

@connor_weiss has joined the channel

brian_hutchison (savingpvtbrian7@gmail.com)
2016-11-09 20:08:03

@brian_hutchison has joined the channel

sholzer (sholzerpie@gmail.com)
2016-11-09 21:05:46

@sholzer has joined the channel

lia_johansen (lilixlucky@gmail.com)
2016-11-09 21:45:47

Hey everyone. I have sent an email out that lists the homework. Please read the "Robot programming" and "WPILib overview" sections in the developers' handbook. We will have a short quiz next meeting (11/16/16, 3-5 pm). Also, if you have not emailed me your github username, please do so soon.

👍 Clio Batali, coachchee
jack (jack@phroa.net)
2016-11-09 23:23:35

I'm going to be about 15 minutes late to every meeting that starts at 3:00 fyi

lia_johansen (lilixlucky@gmail.com)
2016-11-10 07:05:26

@jack: thanks for letting us know

riyadth (riyadth@gmail.com)
2016-11-10 08:51:14

I went over the robot control system in a bit of a hurry yesterday. I recommend you all check out the FRC documentation on the control system, particularly the hardware overview: https://wpilib.screenstepslive.com/s/4485/m/24166/l/144968-2016-frc-control-system-hardware-overview

👍 lia_johansen, Clio Batali
riyadth (riyadth@gmail.com)
2016-11-10 08:52:17

And remember, that document is part of a treasure trove of information about the overall control system and how to use it: https://wpilib.screenstepslive.com/s/4485

Sam Rosen (rosensam000@frogrock.org)
2016-11-10 12:19:04

@Sam Rosen has joined the channel

marie_sachs (marietsachs@gmail.com)
2016-11-10 17:35:40

@marie_sachs has joined the channel

lia_johansen (lilixlucky@gmail.com)
2016-11-14 08:12:31

Hey everyone. Just a reminder that not all of you have emailed me your github account. If that is you, please do so soon. Thanks

lia_johansen (lilixlucky@gmail.com)
2016-11-15 08:23:26

This is a reminder to read "Robot programming" and "WPILib overview" from the developers' handbook on github for tomorrow's meeting. We will be having a quick quiz. Thanks

niklas_pruen (niklas.pruen@gmail.com)
2016-11-16 10:40:37

@niklas_pruen has joined the channel

dana_batali (dana.batali@gmail.com)
2016-11-16 14:41:22

Here is a link to a slide-deck I created called “Intro to FRC programming”. It overlaps with the developer’s handbook, but also has links to example programs and programming exercises. It’s a work in progress and might take a few reads to fully digest. https://docs.google.com/presentation/d/1ZiMBC9y3xrwFk1akdaiV_BMLLS6EyY6BSfiTRQo1KlM/edit?usp=sharing

dana_batali (dana.batali@gmail.com)
2016-11-16 14:41:23
dana_batali (dana.batali@gmail.com)
2016-11-17 12:29:41

@dana_batali pinned their GSuite Presentation Intro to FRC Programming to this channel.

tom_wiggin (twigginthecool@gmail.com)
2016-11-17 19:13:19

marketing is having a heated discussion about team uniforms btw

john_sachs (johncsachs@gmail.com)
2016-11-17 20:04:46

@john_sachs has joined the channel

michael_nelson (michael.nelson1496@gmail.com)
2016-11-20 15:22:31

@lia_johansen: what's your email, I think all the emails are marked as junk or spam.

lia_johansen (lilixlucky@gmail.com)
2016-11-20 15:25:29

@michael_nelson: lilixlucky@gmail.com

tom_wiggin (twigginthecool@gmail.com)
2016-11-20 16:47:51

if you need to see someones email address check their profile

noah_martin (2013islandboy@gmail.com)
2016-11-30 18:52:49

@noah_martin has joined the channel

tom_wiggin (twigginthecool@gmail.com)
2016-11-30 21:24:23

Sorry I couldn't join the November 30th meeting; my mother was sick and couldn't drive me to school

😧 tom_wiggin
lia_johansen (lilixlucky@gmail.com)
2016-12-01 16:08:45

Hey Everyone,

For our next meeting (12/14/16) please read slides #1-38 on the slideshow linked below. This will help you get a better understanding of FRC programming.

https://docs.google.com/presentation/d/1ZiMBC9y3xrwFk1akdaiV_BMLLS6EyY6BSfiTRQo1KlM/edit#slide=id.p

tom_wiggin (twigginthecool@gmail.com)
2016-12-02 17:15:22

what does the clone command in git do?

dana_batali (dana.batali@gmail.com)
2016-12-02 17:41:50

clone is the way to bootstrap one repository given another. Typically you do this once per repository. In the slides referenced above, there’s a work-in-progress git tutorial (starting at page 65) that describes these init steps:

  1. create github account
  2. fork a spartronics repository into your github account
  3. clone your fork onto your development machine
  4. start working on the clone of the fork

As you might imagine, fork and clone are closely related.

tom_wiggin (twigginthecool@gmail.com)
2016-12-02 17:44:08

nevermind I found the documentation

tom_wiggin (twigginthecool@gmail.com)
2016-12-02 17:44:23

thanks

dana_batali (dana.batali@gmail.com)
2016-12-02 17:45:44

btw: programmers shouldn’t worry about git until they complete exercises 1 and 2. This channel is a good place for questions and answers. If you have any questions, fire away! And if you’ve already finished these exercises, please chime in and help other team-members. Thanks!

rose_bandrowski (rose.bandrowski@gmail.com)
2016-12-07 06:54:31

http://www.bainbridgereview.com/life/things-get-robotic-this-wednesday-at-westside-pizza/ spread the word!!!

finn_mander (finn.mander@gmail.com)
2016-12-08 17:38:34

Hope it's ok that I ask here: Does anyone have an hdmi capture card that I can use at FLL? They are typically used with recording video games.

tom_wiggin (twigginthecool@gmail.com)
2016-12-08 17:39:06

I wish

tom_wiggin (twigginthecool@gmail.com)
2016-12-08 17:39:16

sorry but no I don't have one

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 17:44:08

@finn_mander: You can't use software recording like OBS?

alex_lf (al3xlf@gmail.com)
2016-12-08 17:44:35

pretty sure you need a capture card if you're using a separate device to stream

finn_mander (finn.mander@gmail.com)
2016-12-08 17:45:18

I need an interface to connect the hdmi input. Computers output hdmi but can't input it without a capture card from what i've read. Thanks for your response!

finn_mander (finn.mander@gmail.com)
2016-12-08 17:45:34

No worries, thanks Tom

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 17:45:48

Yeah, you will need a capture card then

alex_lf (al3xlf@gmail.com)
2016-12-08 17:46:28

It's worth a try, it seems there is some software you can use

alex_lf (al3xlf@gmail.com)
2016-12-08 17:48:27

ah never mind you need to do it through a network

joncoonan (jonathancoonan@gmail.com)
2016-12-08 17:49:29

The school network is not nearly fast enough FYI - we did that last year and it lagged by about 3 seconds @ 600 x 400 resolution

👎 alex_lf
😱 joncoonan
declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 17:50:33

I have a network switch, so you could potentially connect them all through ethernet

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 17:50:48

Although that situation is less than optimal

joncoonan (jonathancoonan@gmail.com)
2016-12-08 17:51:04

There are no ethernet ports in the gym

joncoonan (jonathancoonan@gmail.com)
2016-12-08 17:51:13

And that is a long cord anyways

alex_lf (al3xlf@gmail.com)
2016-12-08 17:51:33

it would just need to be between the two computers I think

joncoonan (jonathancoonan@gmail.com)
2016-12-08 17:51:49

If I understand Finn’s setup correctly

joncoonan (jonathancoonan@gmail.com)
2016-12-08 17:51:53

It goes like this

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 17:51:54

Yeah, that's the point of the switch

👍 alex_lf
declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 17:52:07

Like a mini network to communicate on

alex_lf (al3xlf@gmail.com)
2016-12-08 17:52:44

either way the problem with the software is that it's closer to remote desktop and not quite what we want

alex_lf (al3xlf@gmail.com)
2016-12-08 17:52:52

There are a couple other options though

alex_lf (al3xlf@gmail.com)
2016-12-08 17:54:46

alright this one looks promising: http://spacedesk.ph/

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 18:05:03

@finn_mander What is the actual setup you have for this? (Sorry if you got double mentioned)

tom_wiggin (twigginthecool@gmail.com)
2016-12-08 20:07:35

or we could just use remote login?

tom_wiggin (twigginthecool@gmail.com)
2016-12-08 20:07:46

what do we need this for?

tom_wiggin (twigginthecool@gmail.com)
2016-12-08 20:09:35

live stream over LAN right?

tom_wiggin (twigginthecool@gmail.com)
2016-12-08 20:09:51

VLC has built in network streaming

finn_mander (finn.mander@gmail.com)
2016-12-08 20:22:40

It wouldn't actually be connected to the internet. The screen on the computer would be displaying the camera feed live. We would just have the projector switch between two sources: the computer with the camera, and the score computer

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 20:26:58

Can you connect the camera to the score computer?

alex_lf (al3xlf@gmail.com)
2016-12-08 20:28:04

you could probably just switch the input for the score projector

alex_lf (al3xlf@gmail.com)
2016-12-08 20:28:14

from the camera to the score computer

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 20:30:55

@finn_mander: Switching the input might work well

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 20:31:06

I think that's a good idea

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 20:31:23

Although there is a bit of a delay when switching input

finn_mander (finn.mander@gmail.com)
2016-12-08 20:31:57

Yeah but it is an easy way to switch between sources. Besides, we don't have a video switcher available to us

finn_mander (finn.mander@gmail.com)
2016-12-08 20:32:12

It worked pretty well last year. The delay was maybe 2-3 seconds

finn_mander (finn.mander@gmail.com)
2016-12-08 20:32:46

@alex_lf: yeah that's what I was hoping to do

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 20:41:53

@finn_mander: If you can directly connect the camera to the scoring computer then you can use something like OBS (obsproject.org) to switch seamlessly or on a timer, and then you can fullscreen it in preview mode on the projector. That's the only other idea I have, although it might be more trouble than it's worth. We'll see soon, I suppose.

jack (jack@phroa.net)
2016-12-08 20:42:45

the original issue was that he didn't have a capture card to do that :)

declan_freemangleason (declanfreemangleason@gmail.com)
2016-12-08 20:50:44

Well, that's the if. I was under the impression that there were two computers and inferred that if the camera was able to connect to one computer then it would be able to connect to the other. Honestly though, I don't really have enough information about the situation to provide very helpful advice, which is why I'm not going to provide any more of it unsolicited.

dana_batali (dana.batali@gmail.com)
2016-12-09 08:32:50

The axis cameras we have for the robot can stream their output to a computer via a LAN connection (i.e. via a hub/router)... Not sure if that's of any use for Finn's setup

finn_mander (finn.mander@gmail.com)
2016-12-09 09:35:06

Thanks everyone for your help! I think I may have been overcomplicating the setup. We may be able to simply run an hdmi cable from the camera to the projector and then an hdmi cable from the scoring computer to the projector. Declan, you have a great point. Using that software would make a great transition, though I'm not sure I'll be able to find access to a capture card in time to use that. Thanks for the idea, Dana. I would like to avoid network or LAN connections since unfortunately the school internet has a really low bitrate.

tom_wiggin (twigginthecool@gmail.com)
2016-12-09 09:51:30

the BYOD network has an awful bitrate

tom_wiggin (twigginthecool@gmail.com)
2016-12-09 09:51:40

the regular one doesn't

tom_wiggin (twigginthecool@gmail.com)
2016-12-09 09:51:43

also

tom_wiggin (twigginthecool@gmail.com)
2016-12-09 09:52:04

ask everyone to stop using the network while you are doing it

dana_batali (dana.batali@gmail.com)
2016-12-09 10:57:17

if you go straight through a router you don’t need to access the byod network at all

dana_batali (dana.batali@gmail.com)
2016-12-09 10:57:26

but best to keep it simple

tom_wiggin (twigginthecool@gmail.com)
2016-12-09 17:47:50

whats the point of all the fancy security if you can just access it directly through the router?

dana_batali (dana.batali@gmail.com)
2016-12-09 19:06:26

Tom - one can connect multiple computers via one network without going through another network. I assume that you are referring to the BHS network security? That is justified to ensure that the internet isn’t broadly available. In our case, we don’t need access to the internet, we just need one computer to talk to another, ie be in the same network. So we don’t need access to BHS network & the internet.

tom_wiggin (twigginthecool@gmail.com)
2016-12-11 20:08:43

?

jack (jack@phroa.net)
2016-12-11 20:10:50

in the end, they just used two video cables and the button on the projector remote to switch inputs. (until we got a second projector, anyway)

dana_batali (dana.batali@gmail.com)
2016-12-15 08:35:08

Hey programmers - lots of great progress last night! If you get a chance, please do try to work through the examples on your own time. Questions can be posted to this channel! Also, we'll be adding new examples to the set as time allows, so do check in on the slide-set as time permits.

michael_nelson (michael.nelson1496@gmail.com)
2016-12-17 12:39:35

@michael_nelson has left the channel

tom_wiggin (twigginthecool@gmail.com)
2016-12-18 15:23:41

Haven't had time to read anything because of christmas celebrations

tom_wiggin (twigginthecool@gmail.com)
2016-12-18 15:25:58

btw I assumed you meant you could just plug in with an ethernet cable into the router and get internet and printer access without authorization

jack (jack@phroa.net)
2016-12-18 15:28:05

ok please read things now that you do :)

jack (jack@phroa.net)
2016-12-18 15:28:20

we made a lot of progress, important for everyone to catch up

michael_nelson (michael.nelson1496@gmail.com)
2016-12-20 00:33:02

@michael_nelson has joined the channel

tom_wiggin (twigginthecool@gmail.com)
2016-12-22 13:53:13

what is an application stack?

jack (jack@phroa.net)
2016-12-22 15:03:37

just in general?

tom_wiggin (twigginthecool@gmail.com)
2016-12-22 15:28:10

what would you use one for?

jack (jack@phroa.net)
2016-12-23 13:27:38

it's just the list of technologies your application uses. the os, database layer, server framework, client framework... mostly called a stack in web development

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 16:11:02

still confused

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 16:11:10

whats a database layer?

jack (jack@phroa.net)
2016-12-23 16:41:45

most business programs use other programs to store and fetch data since they're designed for doing it well

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:04:52

server framework?

jack (jack@phroa.net)
2016-12-23 20:07:32

I know you were learning Ruby, are you familiar with Rails/Sinatra? (even the concept behind it, not necessarily Rails itself)

jack (jack@phroa.net)
2016-12-23 20:08:06

better analogy: WPILib

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:08:30

I gave up a quarter of the way through and did something else

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:08:39

we are using java right?

jack (jack@phroa.net)
2016-12-23 20:08:47

a quarter of the way through Dana's presentation?

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:08:58

no through the ruby guide

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:09:16

it was written for a way older version of ruby

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:09:34

oh wpilib?

jack (jack@phroa.net)
2016-12-23 20:14:48

https://ruby-community.com/pages/links

tom_wiggin (twigginthecool@gmail.com)
2016-12-23 20:21:20

there we go

jack (jack@phroa.net)
2016-12-24 21:31:31

@jack pinned a message to this channel.

} Lia Johansen (https://spartronics.slack.com/team/U29C4U223)
tom_wiggin (twigginthecool@gmail.com)
2016-12-25 15:56:29

do we have a google classroom?

james (james@slattery.tech)
2016-12-26 16:00:15

I don't think so

tom_wiggin (twigginthecool@gmail.com)
2016-12-26 18:47:01

k

niklas_pruen (niklas.pruen@gmail.com)
2017-01-04 18:13:29

will we need a computer today or is it just the kickoff training

Clio Batali (cliombatali@gmail.com)
2017-01-04 18:14:03

Nope, just kickoff!

brian_hilst (brian@hilst.org)
2017-01-09 21:34:00

@brian_hilst has joined the channel

lia_johansen (lilixlucky@gmail.com)
2017-01-12 15:47:50

Hey everyone, please bring your engineering notebooks (per usual) for brainstorming tomorrow

👍 timo_lahtinen
👌 timo_lahtinen
✌ timo_lahtinen
:fleur_de_lis: timo_lahtinen
:octocat: timo_lahtinen
:clio: timo_lahtinen, declan_freemangleason, Clio Batali
declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-13 17:12:01

If anyone is interested in the thread I posted asking about light sensors for detecting the gaffers tape in autonomous, here it is: https://www.chiefdelphi.com/forums/showthread.php?t=153561

lia_johansen (lilixlucky@gmail.com)
2017-01-13 19:12:32
riyadth (riyadth@gmail.com)
2017-01-13 21:07:16

@declan_freemangleason That is a good thread, and it's already getting some interesting responses. One thing we may need to consider is how we determine which color tape we are looking for. I seem to remember that once autonomous starts, the FMS informs the robot of alliance color via the network tables. That could be a more reliable way to look for the correct color than a setting made on the driver station (where it could be accidentally set to the wrong color).
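
If we go that route, reading the alliance color in code is only a couple of lines; here is a minimal sketch assuming the WPILib DriverStation API of that era (DriverStation.getInstance().getAlliance()), with a made-up helper class name.

    import edu.wpi.first.wpilibj.DriverStation;
    import edu.wpi.first.wpilibj.DriverStation.Alliance;

    // Hypothetical helper: decide which tape color to look for based on what the
    // FMS / driver station reports once the match is running.
    public class AllianceColorHelper {
        public static boolean lookForRedTape() {
            Alliance alliance = DriverStation.getInstance().getAlliance();
            // Alliance.Invalid means no FMS connection yet; a real implementation
            // would fall back to a driver station setting in that case.
            return alliance == Alliance.Red;
        }
    }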

riyadth (riyadth@gmail.com)
2017-01-13 21:09:18

Maybe we should consider using blue and red LEDs near the sensor to help identify the tape color (that is, illuminate the area under the sensor with light the same color as the tape we are looking for -- that might increase the amount of light reflected by the tape, relative to the carpet). Plus it would look cool to shine blue or red light out from under our robot...

declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-13 21:11:09

Yeah, I think that would look really cool, and probably work well. If we put two color sensors on the robot then we could even determine the angle of the lines.

jack (jack@phroa.net)
2017-01-14 21:05:29

bring your notebook tomorrow

👍 lia_johansen, Clio Batali
jack (jack@phroa.net)
2017-01-15 13:52:35

git: https://git.io/vMw8y

jack (jack@phroa.net)
2017-01-15 13:52:42

@jack pinned a message to this channel.

} Jack Stratton (https://spartronics.slack.com/team/U2FJQQJ95)
declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-15 14:15:13

Here is the link to the CTRE Toolsuite which includes the javadoc and the actual library: http://www.ctr-electronics.com/control-system/hro.html#product_tabs_technical_resources

riyadth (riyadth@gmail.com)
2017-01-15 14:52:23

Link to Screensteps WPIlib control system pages: https://wpilib.screenstepslive.com/s/4485

dana_batali (dana.batali@gmail.com)
2017-01-15 15:24:25

serial number for installing FRC Control System (National Instruments driver station): M82X13758 (serial number came with KOP)

dana_batali (dana.batali@gmail.com)
2017-01-15 15:25:20

(only install this if you have windows and think you need driver station on your computer)... This isn't needed for mainstream development

declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-15 15:37:46

Here is the list of sensors we're interested in that I sent to Clio:

  1. Distance sensor (different types?)
  2. Color sensor (related thread: https://www.chiefdelphi.com/forums/showthread.php?t=153561, get a few kinds)
  3. Camera that Dana mentioned (https://www.amazon.com/Pixy-CMUcam5-Smart-Vision-Sensor/dp/B00IUYUA80)

Clio Batali (cliombatali@gmail.com)
2017-01-15 15:49:48

http://www.schneider-electric.co.uk/en/faqs/FA142566/ Difference between PNP and NPN. Worth checking out!

adrianna_carter (adriannascarter@gmail.com)
2017-01-15 16:40:41

@adrianna_carter has joined the channel

coachchee (echee@bisd303.org)
2017-01-15 17:59:04

Please give the list to Robert for me to purchase. Ask the captains to explain how we make orders.

riyadth (riyadth@gmail.com)
2017-01-15 19:32:38

I have found some information on the Lego NXT color sensor, and I am pretty sure we COULD make use of it on our robot, but it might be a bit of work. Here is a link to what I found: https://www.wayneandlayne.com/bricktronics/design-and-theory/#sensor_color

One thing we need to understand is how it works. It is basically a light sensor (for white light, which is a mixture of red, green and blue), and includes three LEDs (red, green and blue) to illuminate the thing being sensed. To determine the color, the sensor turns on each LED in turn, and measures the amount of light reflected back. It can send back the color as three numbers representing the red, green and blue content. Note that it must be calibrated in order to work correctly.

One challenge we will have is how it works while we are moving. Since three measurements must be taken to determine the color, we would need the sensor to be over the tape long enough to take at least one reading of each color, and we actually have to be over the tape for at least double that time because we don't know when the measurement cycle starts.

The article didn't say anything about the timing of the cycle in the sensor, so I don't know if it will work. But we should be able to determine how much time we have if we estimate our robot's speed, and use that to determine how many milliseconds it takes to cross the tape on the field. Does anyone in the group know how fast our robot will go? Or how wide the tape is?
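
To put rough numbers on that last question, here is a back-of-the-envelope sketch; the tape width and robot speed below are assumptions (guesses to be replaced once someone checks the game manual and our drivetrain), not measured values.

    // Rough estimate: how long does a floor-facing sensor spend over the tape?
    public class TapeTiming {
        public static void main(String[] args) {
            double tapeWidthMeters = 0.05;  // ~2 inch gaffers tape (assumed)
            double robotSpeedMps   = 3.0;   // assumed speed while crossing the line

            double timeOverTapeMs = (tapeWidthMeters / robotSpeedMps) * 1000.0;
            // 0.05 / 3.0 * 1000 ≈ 16.7 ms at these guesses; compare that against the
            // sensor's per-reading time to see how many full R/G/B cycles would fit.
            System.out.printf("Time over tape: %.1f ms%n", timeOverTapeMs);
        }
    }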

riyadth (riyadth@gmail.com)
2017-01-15 19:38:37

More information on the Lego color sensor. This page shows a test that determined it takes 2.5ms to make a color reading: http://www.philohome.com/colcomp/cc.htm

riyadth (riyadth@gmail.com)
2017-01-15 19:39:03

But that page also seems to indicate the sensor doesn't use I2C interfacing, contrary to the previous page...

riyadth (riyadth@gmail.com)
2017-01-15 19:52:59

This sensor looks promising, but I can't tell how far away from the tape we can put it: https://www.adafruit.com/products/1334

riyadth (riyadth@gmail.com)
2017-01-15 19:58:35

But it also takes longer to read the color. Looks like a minimum of 2.4ms, and up to 700ms for more accuracy.

chrisrin (chrisrin@microsoft.com)
2017-01-15 19:59:01

I recall one of the Microsoft mentors said they have had good luck with allen bradley sensors. http://ab.rockwellautomation.com/Sensors-Switches/Color-and-Contrast-Photoelectric-Sensors

riyadth (riyadth@gmail.com)
2017-01-15 20:09:33

They look nice. I think this model might be nice: 45CLR-5JPC1-D8. But the only place I could see a price for it wanted $630, which is definitely outside our price range.

chrisrin (chrisrin@microsoft.com)
2017-01-15 20:42:42

Wow! Maybe they were talking about a different kind of sensor (I think he said something like "3-beam")

riyadth (riyadth@gmail.com)
2017-01-15 20:48:48

The one I pointed to has 3 PNP outputs, and you program it with buttons to set an individual output high when it recognizes a specific color. So you point it at the color, push some buttons, and it learns the color. We would just have to look at a digital input for "RED" or "BLUE" colors, and when the input goes high then we are at the color target. It can even work up to 3cm away from the target, which is nice, since we don't want it too close to the ground in case it gets stuck on a loose bit of carpet or something.
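
On the software side, that kind of taught-color sensor is about as simple as sensing gets; here is a minimal sketch assuming WPILib's DigitalInput class, with made-up DIO channel numbers (and assuming electronics confirms the sensor's outputs can be wired to roboRIO digital inputs at a safe voltage).

    import edu.wpi.first.wpilibj.DigitalInput;

    // Hypothetical wrapper: each taught color on the sensor drives one output,
    // which the robot code reads as a plain digital input. Channels are made up.
    public class TapeColorSensor {
        private final DigitalInput redOutput  = new DigitalInput(0);
        private final DigitalInput blueOutput = new DigitalInput(1);

        public boolean onRedTape()  { return redOutput.get(); }
        public boolean onBlueTape() { return blueOutput.get(); }
    }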

riyadth (riyadth@gmail.com)
2017-01-15 20:49:02

I suspect it can be had cheaper, but I don't know where.

jack (jack@phroa.net)
2017-01-15 21:11:21

wow, that sort of thing is really expensive http://www.newark.com/color-contrast-photoelectric-sensors?searchRef=SearchLookAhead

newark.com
jack (jack@phroa.net)
2017-01-15 21:13:44

never mind, that was the "fancy" category. http://www.newark.com/vishay/veml6040a3og/rgbw-colour-sensor-digital-o-p/dp/41Y9252

newark.com
jack (jack@phroa.net)
2017-01-15 21:14:30

it's nice that the tape is either white, red, or blue, and the carpet is green. perfect match for the LEDs

riyadth (riyadth@gmail.com)
2017-01-15 21:52:06

That Vishay sensor is just a chip. We probably need something fancier than that, but maybe not as fancy as the expensive stuff...

jack (jack@phroa.net)
2017-01-16 18:52:04

Everyone using Eclipse: in the repository is an extra folder. Go to Window -> Preferences -> Java -> Code Style in Eclipse. Match the .xml files in extra with the tabs labeled Formatter and Clean Up -- import them, set as active, ok, ok, ok

Now, every time you make a change, right click on the project on the left side of eclipse and go to Source -> Format and Source -> Clean Up...

binnur (binnur.alkazily@gmail.com)
2017-01-17 20:57:28

@jack nice job!! walking through your repo setup and your git presentation (and thank you, in regards to the placement of {} 🙂 )

jack (jack@phroa.net)
2017-01-17 20:58:00

it was more of a visual aid to a mostly spoken presentation, but if anyone wants the material as reference, hey, there it is :)

binnur (binnur.alkazily@gmail.com)
2017-01-17 21:00:30

next is my favorite question — in the repo — I mean dashboard, can we track version number (can’t see your prior code…)

binnur (binnur.alkazily@gmail.com)
2017-01-17 21:02:19

^^^ (clarified, I hope — Riyadth said I didn’t make sense :)

riyadth (riyadth@gmail.com)
2017-01-17 21:24:15

Details of the actual field, in photographs.

} Riyadth Al-Kazily (https://spartronics.slack.com/team/riyadth)
binnur (binnur.alkazily@gmail.com)
2017-01-17 21:27:04

^^^ autonomous team - good resource to checkout for any ideas

jack (jack@phroa.net)
2017-01-17 21:28:18

binnur: yeah, I was thinking I'd find someone to copy over last year's buildsystem with

jack (jack@phroa.net)
2017-01-17 21:28:23

**at tomorrow's meeting

binnur (binnur.alkazily@gmail.com)
2017-01-17 21:29:05

@jack that is an awesome idea!!! 🙂 please lets make sure there is also a good readme generated in the process for next year!

declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-17 21:31:37

@jack: if you end up doing Continuous Integration you can have that tag commits on GitHub with said version numbers

jack (jack@phroa.net)
2017-01-17 21:32:27

I finished setting up Travis yesterday, but what we're talking about is having the local buildsystem copy the current git revision to $location on the robot so that it's displayed on the driver station along with who built it and when

jack (jack@phroa.net)
2017-01-17 21:33:11

last year we had ant copy some variables to the jar manifest, and read them back. it worked on... 30% of the programmers' computers?

jack (jack@phroa.net)
2017-01-17 21:34:27

we need to either fix it or try something else

binnur (binnur.alkazily@gmail.com)
2017-01-17 21:35:48

seemed to work whenever I was looking for it 🙂 or, I just remember the good stuff 😉

declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-17 21:55:53

@jack: I wasn't offering a solution to that, just putting a thought out there I had relating to versioning in regard to Travis.

coachchee (echee@bisd303.org)
2017-01-17 21:59:34

Who is Travis ? Gabe ?

coachchee (echee@bisd303.org)
2017-01-17 22:00:36

or what ?

jack (jack@phroa.net)
2017-01-17 22:05:06

program to automatically build code on github

jack (jack@phroa.net)
2017-01-17 22:09:45

just makes sure that the code on github doesn't have any broken files

coachchee (echee@bisd303.org)
2017-01-17 22:42:55

Thanks !

timo_lahtinen (timolahtinen1@gmail.com)
2017-01-18 16:54:57
timo_lahtinen (timolahtinen1@gmail.com)
2017-01-18 16:55:05

@timo_lahtinen pinned a message to this channel.

} Timo Lahtinen (https://spartronics.slack.com/team/U297GL279)
timo_lahtinen (timolahtinen1@gmail.com)
2017-01-18 19:12:23
timo_lahtinen (timolahtinen1@gmail.com)
2017-01-18 19:12:30

@timo_lahtinen pinned a message to this channel.

} Timo Lahtinen (https://spartronics.slack.com/team/U297GL279)
binnur (binnur.alkazily@gmail.com)
2017-01-18 20:52:46

@coachchee I got a Travis in the office — confused the heck out of me the first time I read it 🙂

riyadth (riyadth@gmail.com)
2017-01-18 21:08:18

Programming team: Awesome work today! I am very impressed how the team has kept on task, got the robot moving, and enabled testing of the launch module! We're ahead of where I thought we'd be by now, which is great. We can work on adding useful features, and making our code robust and easy to debug.

I would appreciate it if every feature team implements some interface on the smart dashboard, either to indicate the status of their module (such as running or not running, or if a jam is detected), or to allow the driver to input a new parameter (such as the speed that the launcher or intake motor is running at). This will help the mechanics test their work, and make it easier to tune the robot for accuracy.

The good news is that we can combine our efforts, and use similar code for each of the features. I recommend everyone do a little research on using the smart dashboard for input and output with their commands, and we can brainstorm the best ways to do it for all the modules.
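
For reference, the dashboard plumbing itself is only a couple of calls per module; here is a minimal sketch using WPILib's SmartDashboard class, with made-up keys, a made-up class name, and an arbitrary default speed.

    import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

    // Example of the two patterns described above: publishing module status and
    // reading a driver-tunable parameter. Keys and defaults are examples only.
    public class LauncherDashboard {
        public void reportStatus(boolean running, boolean jamDetected) {
            SmartDashboard.putBoolean("Launcher Running", running);
            SmartDashboard.putBoolean("Launcher Jam", jamDetected);
        }

        public double getTargetSpeed() {
            // The second argument is the default used until the driver edits the value.
            return SmartDashboard.getNumber("Launcher Speed", 0.8);
        }
    }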

riyadth (riyadth@gmail.com)
2017-01-18 21:08:33

I'm already looking forward to next time!

binnur (binnur.alkazily@gmail.com)
2017-01-18 21:13:33

I’ll give a shout out for the SmartDashboard Test Mode — it seems somewhat limited (doesn’t seem to have support for talon srx motors), BUT if we can get it done right, it is a great way to verify electronics work without wondering if your code is the problem... Here is the link: http://wpilib.screenstepslive.com/s/4485/m/26401/c/92707

binnur (binnur.alkazily@gmail.com)
2017-01-18 21:14:17

And, based on last year’s experiment, I believe we are sticking to SmartDashboard (not the SFX v2 version). @danabatali @liajohansen @timo_lahtinen please validate.

jack (jack@phroa.net)
2017-01-18 21:19:38

One thing making us consider SFX is having a big green/red square to indicate whether the intake was on, as our current version is a toggle button on a joystick. (Though depending on drivers, that might be changed)

jack (jack@phroa.net)
2017-01-18 21:19:56

graphics like that are apparently easier

binnur (binnur.alkazily@gmail.com)
2017-01-18 21:25:39

K - if that is the case, I recommend starting to get familiar with it sooner rather than later — set up a user workflow that everyone will follow, so that the final driver station layout comes together as we actively develop.

dana_batali (dana.batali@gmail.com)
2017-01-19 09:23:21

last year, we were pushed toward second-gen smart dashboard, because first-gen didn't support two camera feeds. Additionally there were widgets in the second gen that were sexier (graphs, etc). That said, second-gen caused more problems than it solved. If we can use 1st gen, we should stick with that. If that's not sufficient, we should look at rolling our own, probably via a web/javascript interface. I have links to other teams githubs that have followed this path.

dana_batali (dana.batali@gmail.com)
2017-01-19 11:02:53

http://www.ctr-electronics.com/Talon%20SRX%20Software%20Reference%20Manual.pdf

section 12.4: Velocity Closed-Loop Walkthrough – Java

dana_batali (dana.batali@gmail.com)
2017-01-19 12:10:33

On the topic of CANTalon in speed/velocity control mode... The question all programmers need to consider: what are the units we pass via motor.set() when in this mode? This answer depends entirely on the combination of the CANTalon library conventions AND the quad-encoder's CPM value (this is discussed in the electronic-pneumatics channel). Here's the pivotal table from the CANTalon documentation (table 17.2.2) in the manual above
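
To make the units question concrete, here is the conversion arithmetic as a sketch, assuming the manual's convention that native velocity units are sensor counts per 100 ms and that the quadrature encoder is decoded 4x; the cycles-per-revolution value is a placeholder until we confirm our encoder's spec.

    // Convert a target speed in RPM to Talon SRX native velocity units
    // (sensor counts per 100 ms). ENCODER_CYCLES_PER_REV is a placeholder.
    public class TalonVelocityUnits {
        static final int ENCODER_CYCLES_PER_REV = 360;                  // placeholder value
        static final int COUNTS_PER_REV = ENCODER_CYCLES_PER_REV * 4;   // 4x quadrature decoding

        // rev/min * counts/rev = counts/min; divide by 600 because there are
        // 600 hundred-millisecond periods in a minute.
        public static double rpmToNativeVelocity(double rpm) {
            return rpm * COUNTS_PER_REV / 600.0;
        }

        public static void main(String[] args) {
            // e.g. 3000 RPM with a 360-cycle encoder: 3000 * 1440 / 600 = 7200 counts/100ms
            System.out.println(rpmToNativeVelocity(3000.0));
        }
    }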

dana_batali (dana.batali@gmail.com)
2017-01-19 12:13:07

API requirements and Native units

dana_batali (dana.batali@gmail.com)
2017-01-19 12:17:17
dana_batali (dana.batali@gmail.com)
2017-01-19 12:26:19

Another topic: Riyadth & Binnur suggested that we start providing information to the smart dashboard. For the first exercise, I suggest that all Subsystems populate the smart dashboard with the result of the initialization (success or failure). This should be done in your subsystem's constructor and is as simple as this (pseudo code follows):

try { ... initialization stuff.. } catch { m_initialized = false; }

SmartDashboard.putString("Drivetrain Subsystem", m_initialized ? "initialized" : "disabled");

dana_batali (dana.batali@gmail.com)
2017-01-19 12:27:20

Note: this will not appear on the smart dashboard until the named field (here: 'Drivetrain Subsystem') has been manually added to the smartdashboard on the driver station.
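
Filling in the pseudo code a bit, a sketch of what that could look like in a subsystem constructor; the class name, device id, and dashboard key are placeholders:

```java
import com.ctre.CANTalon;
import edu.wpi.first.wpilibj.command.Subsystem;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class Drivetrain extends Subsystem {
    private boolean m_initialized = true;
    private CANTalon m_leftMaster;

    public Drivetrain() {
        try {
            m_leftMaster = new CANTalon(3);   // placeholder device id
            // ... remaining motor/encoder/imu setup ...
        } catch (Exception e) {
            m_initialized = false;
        }
        // Report the result of initialization so it shows up on the driver station.
        SmartDashboard.putString("Drivetrain Subsystem",
                m_initialized ? "initialized" : "disabled");
    }

    public boolean isInitialized() {
        return m_initialized;
    }

    @Override
    protected void initDefaultCommand() {
    }
}
```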

tom_wiggin (twigginthecool@gmail.com)
2017-01-19 21:25:36

I'm really sorry I missed all the meetings and I feel awful

tom_wiggin (twigginthecool@gmail.com)
2017-01-19 21:26:34

both my computers and my phone were confiscated for not doing homework and I was absolutely swamped

tom_wiggin (twigginthecool@gmail.com)
2017-01-19 21:28:45

I missed the eclipse setup and the git tutorial according to Jack

tom_wiggin (twigginthecool@gmail.com)
2017-01-19 21:30:06

I calculated that if you converted all the google classroom assignments I have missing to paper, you would have a tower 20 feet high, or enough to fill a small closet

riyadth (riyadth@gmail.com)
2017-01-20 12:53:21

Programmers, please join the Slack channels that are related to the subsystem you are working on (if you're not already there...). Those channels are a great place to discuss features and capabilities with the mechanical and electrical teams. I see #intake #launcher and #agitator are ready to go. (And I think #agitator may be related to #launcher, so join them both!)

:clio: Clio Batali
👍 lia_johansen
riyadth (riyadth@gmail.com)
2017-01-20 12:53:56

Do any of you on the drivetrain think we need a Slack channel for that? If so, you should ask Clio or one of the other leaders to set it up.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-01-20 15:03:58

@riyadth: I don't think we need a channel for that

dana_batali (dana.batali@gmail.com)
2017-01-20 15:06:12
} Riyadth Al-Kazily (https://spartronics.slack.com/team/riyadth)
tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:10:02

Today I learned that when a piece of software is overloaded and spends the majority of its time switching between threads, it is "thrashing"

tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:10:04

neato

tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:26:32

https://xkcd.com/1597/

} xkcd (http://xkcd.com/)
tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:27:15

git

tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:27:25

made by a bunch of gits for gits

tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:27:37

🤓

timo_lahtinen (timolahtinen1@gmail.com)
2017-01-20 17:47:01
timo_lahtinen (timolahtinen1@gmail.com)
2017-01-20 17:47:09

@timo_lahtinen pinned a message to this channel.

} Timo Lahtinen (https://spartronics.slack.com/team/U297GL279)
tom_wiggin (twigginthecool@gmail.com)
2017-01-20 17:55:55

:rube:

dana_batali (dana.batali@gmail.com)
2017-01-21 10:17:41

I wonder if it makes sense to migrate the GitHub traffic to another channel, say programming_git, so all the checkins don't get in the way of human conversations.... @jack , what say you?

jack (jack@phroa.net)
2017-01-21 10:51:05

that integration is only set up for the 2016 repo, but it's easily (re)moved as long as people think it's useful enough to have a team captain create a new channel

jack (jack@phroa.net)
2017-01-21 10:51:44

(in my opinion the more valuable traffic is "travis build failed for pull request x", which can also be done but not by myself anymore)

dana_batali (dana.batali@gmail.com)
2017-01-21 14:22:58

programmers: if you are interested in seeing checkins and build status messages go by, wander over to the newly created programming_git channel and join-in.

dana_batali (dana.batali@gmail.com)
2017-01-21 14:29:29

Regarding proper configuration of your build environments on windows, I found that I needed to set some environment variables. On windows you do this via the System control panel -> Advanced system settings -> Environment Variables.

  • make sure that JAVA_HOME is present and points to your jdk install. It might look something like this: C:\Program Files\Java\jdk1.8.0_91

  • make sure that PATH has an entry that points to Git. On my machine that looks like this: C:\Program Files\Git\bin

dana_batali (dana.batali@gmail.com)
2017-01-21 14:30:02

the second setting helps the newly introduced build stamp be more descriptive.

dana_batali (dana.batali@gmail.com)
2017-01-21 14:51:02

To programmers of Commands and CommandGroups. Here are some tips to consider when designing your commands:

  • it is recommended that all commands that operate on only a single subsystem follow the naming convention of starting with the subsystem name. Thus: all Intake commands should start with Intake, all Drivetrain commands with Drive, etc.

  • don't rely too heavily on last year's code for structural examples. It is generally too promiscuous (i.e. it doesn't keep private things private).

  • consider adding methods to your subsystem that will be shared across multiple commands. Don't break the encapsulation by allowing commands to directly manipulate the motors, but rather make abstractions. For example, the Drivetrain can have a method called driveStraight, parameterized by some notion of power or speed. Today's version of the Drivetrain and the DriveTicksCommand are good references.

  • make sure to implement and think through your command's "isFinished" method. Adding debug-level logging to your command should help you to ensure that the lifetime of your command matches your expectations.

  • to help with your intuitions, I recommend that you order the methods of your command according to this lifecycle (initialize, execute, isFinished, interrupted, end)

  • make sure all your subsystem's methods check for initialized(). If this is done, then your commands should never need to make this check.

  • if you have state-change commands, you should consider parameterizing a single command rather than implementing the same logic multiple times... For example, right now, the difference between IntakeOn, IntakeOff and IntakeReverse doesn't seem to justify three different commands and three files (a sketch of this idea follows below). Please refer to the exampleRobot: https://github.com/Spartronics4915/exampleRobot/blob/master/src/org/usfirst/frc/team4915/robot/commands/LifterAutoCtlCmd.java

If any of these items seem mysterious to you, any of the programming mentors or leaders should be able to help you with these subtle topics.
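
Here is the sketch mentioned above: one parameterized command in place of IntakeOn/IntakeOff/IntakeReverse. The Intake subsystem and its setMode() method are placeholders for whatever the real subsystem API ends up being:

```java
import edu.wpi.first.wpilibj.command.Command;

// A single parameterized command replacing IntakeOn/IntakeOff/IntakeReverse.
// "Intake" and setMode() are placeholder names, not our actual API.
public class IntakeSetModeCommand extends Command {

    public enum Mode { ON, OFF, REVERSE }

    private final Intake m_intake;
    private final Mode m_mode;

    public IntakeSetModeCommand(Intake intake, Mode mode) {
        m_intake = intake;
        m_mode = mode;
        requires(intake);
    }

    @Override
    protected void initialize() {
        m_intake.setMode(m_mode);   // the subsystem owns the motor; the command just asks
    }

    @Override
    protected void execute() {
    }

    @Override
    protected boolean isFinished() {
        return true;                // a state change is instantaneous
    }

    @Override
    protected void end() {
    }

    @Override
    protected void interrupted() {
        end();
    }
}
```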

tom_wiggin (twigginthecool@gmail.com)
2017-01-21 18:01:03

wheres our simulator at?

dana_batali (dana.batali@gmail.com)
2017-01-21 19:26:09

we don’t have one

Clio Batali (cliombatali@gmail.com)
2017-01-22 11:45:26

Here’s a running catalogue of all of the motor/encoder assignments on the robot (subject to change): https://docs.google.com/document/d/1UOBnhfhSoBlfsNDcr38CnqWt0hnyTbl4eEsBq7iXnIw/edit?usp=sharing

riyadth (riyadth@gmail.com)
2017-01-22 11:46:47

Thanks Clio! Do you happen to know the gear ratio for the drivetrain gearboxes? That would be great to include on that sheet as well.

riyadth (riyadth@gmail.com)
2017-01-22 11:47:43

And if I recall correctly, the drivetrain encoders are connected to motor controllers 3 and 4, correct?

Clio Batali (cliombatali@gmail.com)
2017-01-22 11:47:54

Off the top of my head, no, but that's written down on the robot at the moment (easy to transfer over today)

Clio Batali (cliombatali@gmail.com)
2017-01-22 11:48:02

Yes for the encoders

binnur (binnur.alkazily@gmail.com)
2017-01-22 17:55:31

@binnur pinned a message to this channel.

} Clio Batali (https://spartronics.slack.com/team/U1GRPFY4T)
jack (jack@phroa.net)
2017-01-22 18:18:50

@binnur, https://github.com/phroa/2017-STEAMworks/blob/master/src/org/usfirst/frc/team4915/steamworks/OI.java

GitHub
jack (jack@phroa.net)
2017-01-22 18:19:02

Right now, the appropriate NetworkTables values are being sent to the

jack (jack@phroa.net)
2017-01-22 18:19:05

dashboard but the dashboard mysteriously no longer displays the actual

jack (jack@phroa.net)
2017-01-22 18:19:07

radio button chooser.

jack (jack@phroa.net)
2017-01-22 18:19:09

Using SmartDashboard SFX, the raw options can be seen in an ArrayView(?)

jack (jack@phroa.net)
2017-01-22 18:19:11

but I can't figure out how to show them on the normal dashboard.

jack (jack@phroa.net)
2017-01-22 18:19:13

Potential lead: SendableChooser reports its display type as "String

jack (jack@phroa.net)
2017-01-22 18:19:15

Chooser", we may have to do that explicitly in LoggerChooser even though

jack (jack@phroa.net)
2017-01-22 18:19:17

it extends that method and should report the same.

jack (jack@phroa.net)
2017-01-22 18:34:57

I might remove LoggerChooser again and put each logger's name in the dropdown menu choices, I only put LoggerChooser in since the 'real' SendableChooser isn't a NamedSendable for whatever reason, so you can't give it a label automatically.

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:46:21

regarding sfx — I recall it being tricky — something about needing to edit the properties of the given module using right click options in edit mode. basically add it to the sfx first, and then configure its options. may/may not help

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:47:01

remind me about the output you were seeing on the dashboard: basically the debug levels were showing up correctly, but not the label associated w/ the subsystem, such as intake. correct?

jack (jack@phroa.net)
2017-01-22 18:48:42

that's how it started - later on, the buttons disappeared entirely in smartdashboard 1.0 but you could see them in sfx

jack (jack@phroa.net)
2017-01-22 18:49:12

if I had a robot I'd explore this over the long week :)

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:53:01

🙂 the issue you saw with buttons disappearing could be a caching issue between network tables and smartdashboard — requires to reset roborio and driver station at the same time, if I recall

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:54:15

I am trying to remember our test from couple years back when we were playing w/ smartdashboard — I think we (and I believe I was working with you) concluded that adding a string did not work correctly. may relate to your comment on SendableChooser and String issue

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:55:10

wonder if you can manipulate the network table directly - from 2016 "settings are persisted on the roboRio in /home/lvuser/networktables.ini"

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:55:47

this could be a quick/dirty test on IF what you are thinking will render correctly

jack (jack@phroa.net)
2017-01-22 18:55:55

mm, yes

binnur (binnur.alkazily@gmail.com)
2017-01-22 18:56:19

(regarding access to the robot — depending on your availability during the week, you could ping the coach for access, unless you have finals too :)

jack (jack@phroa.net)
2017-01-22 18:57:07

my finals aren't until march

jack (jack@phroa.net)
2017-01-22 18:57:30

@coachchee, could I come in tuesday, thursday, or friday before 11 am?

binnur (binnur.alkazily@gmail.com)
2017-01-22 19:06:46

I am not seeing anything out of ordinary with the code — so, break the problem down into pieces:

  1. verify that what you intend can display, by seeing if you can manipulate the network table directly
  2. see if your code is populating network tables — you may have already tried this: http://wpilib.screenstepslive.com/s/4485/m/26401/l/255424-verifying-smartdashboard-is-working
  3. see if putData does what it is supposed to (given our surprise in the past that not everything was rendered correctly) —> just hard-wire a menu independent of the subsystems

And, remember SmartDashboard and network tables can get wonky — so, reset both to clear the cache

binnur (binnur.alkazily@gmail.com)
2017-01-22 19:09:23

also — make sure any initialization that is needed is done correctly, so SendableChooser is going to the right SmartDashboard instance. It is possible this is going to nowhere — haven’t used this outside of Robot.java before. hmm… I am not seeing any SendableChooser instantiation — this could be the issue

jack (jack@phroa.net)
2017-01-22 19:14:28

👍

coachchee (echee@bisd303.org)
2017-01-22 19:20:06

Jack, What time ?

jack (jack@phroa.net)
2017-01-22 19:21:27

it depends on what your finals schedule looks like, but I need to be leaving at or before 11. 9:30ish work on tuesday?

coachchee (echee@bisd303.org)
2017-01-22 19:23:08

I am assuming you need access to robot ? How about 8:30 am Tues, before I start teaching ?

jack (jack@phroa.net)
2017-01-22 19:23:16

ok

binnur (binnur.alkazily@gmail.com)
2017-01-22 19:27:42

@niklaspruen and @declanfreemangleason — good simple example of Position mode in ChiefDelphi — https://www.chiefdelphi.com/forums/showthread.php?t=153753

binnur (binnur.alkazily@gmail.com)
2017-01-22 19:33:58

See Talon SRX Programming guide on: 16.9. Why are there multiple ways to get the same sensor data?

binnur (binnur.alkazily@gmail.com)
2017-01-22 19:34:50
jack (jack@phroa.net)
2017-01-24 09:30:08

Not a fan of the fact that these are all sorted differently

jack (jack@phroa.net)
2017-01-24 09:30:56

anyway, thanks binnur! restarting the driver station and the robot at the same time cleared the values out, I wish there was a "empty NetworkTables" button on the driver station console though.

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:32:06

You may be able to work around it by deleting the cached files both from the driver station and roborio -- it needs to be done in both places.

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:32:42

🙂 welcome to software development -- it NEVER ends!

jack (jack@phroa.net)
2017-01-24 09:32:44

I couldn't actually find the files anywhere https://gist.github.com/phroa/722b4a36e37501250c9a10c12f280690

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:32:48

Awesome job!

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:33:37

So... IF you run through the quick test with those values set, and look at the driver station console logs, do they filter out correctly?

jack (jack@phroa.net)
2017-01-24 09:33:49

that's step two :)

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:34:10

have you looked at chief delphi? I'll do a search this evening

jack (jack@phroa.net)
2017-01-24 09:34:30

briefly

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:34:40

I am waiting on your cut/paste of the console output 🙂

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:34:54

nice progress 🙂

jack (jack@phroa.net)
2017-01-24 09:35:42

I'll have it in a sec, need to actually generate some log messages

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:36:11

perfect!

jack (jack@phroa.net)
2017-01-24 09:47:58

oh whoops I had the console scrolled up for that screenshot, let me go down to the current log

jack (jack@phroa.net)
2017-01-24 09:49:28

sending a screenshot from the driver station sucks

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:49:48

WOOT! Looks great!!! And, I see what you mean about the sorting order -- I had to rethink what comes first. @danabatali @liajohansen @jack I suggest we keep our levels short and to the point --> let's remove the 'notice'

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:50:10

you mean you are not using your camera to take a screenshot and then slacking? 🙂

jack (jack@phroa.net)
2017-01-24 09:50:40

I actually took a look at the smartdashboard code, it sorts the entries in the button list by their hashcode... so we basically can't control ordering

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:52:05

interesting! Your code looks great! Lets ship it 🙂

binnur (binnur.alkazily@gmail.com)
2017-01-24 09:52:34

other than reboot -- was there any other changes you needed to make?

jack (jack@phroa.net)
2017-01-24 09:53:53

yeah, I took out that LoggerChooser thing. (I thought it was the issue at first, but it probably wasn't. I just kept the code simple (no loggerchooser) once it fixed itself with a reboot)

jack (jack@phroa.net)
2017-01-24 09:54:00

I'll push an updated version in a second

jack (jack@phroa.net)
2017-01-24 09:55:14

we do have a bit of a race condition in that loggers initialized before the new initLoggers method in OI won't be filtered according to the buttons on the window, but I can't initialize the loggers after getting the user input because initLoggers relies on the loggers being initialized to know what loggers to filter

dana_batali (dana.batali@gmail.com)
2017-01-24 09:55:15

the more we battle with presentation for smart dashboard, the easier it is for us to justify pynetworktables2js.... Coupled with a widget set like dojo or jqwidgets, we would actually save time over the kinds of battles Jack is currently fighting (and that we fought last year)...

dana_batali (dana.batali@gmail.com)
2017-01-24 09:56:59

I'll put together a trivial proof of concept and share it at Friday's meeting

jack (jack@phroa.net)
2017-01-24 09:57:50

additionally, you can't change the filters without (at a minimum) using the "restart robot code" button on the station since I don't have a good idea of when to poll the sendablechooser for the current values. it seems like you can't register an event listener or anything that simple to run when a new option is selected

binnur (binnur.alkazily@gmail.com)
2017-01-24 10:06:15

Unfortunately my day of meetings started -- I'll catch up later. @jack ping me directly if you need to grab my attention. And, again , great job!! :)

chrisrin (chrisrin@microsoft.com)
2017-01-24 10:32:54

Thinking about the climber, I think it's possible there will be three motor states: off, slow (for catching & initially spooling the rope), and fast (for climbing). I'm not sure if that's enough to start working on anything, but just putting it out there.

jack (jack@phroa.net)
2017-01-24 11:25:29

@chrisrin Oh, so we are doing a climber? great!

jack (jack@phroa.net)
2017-01-24 11:41:18

intake team: since we're basically done, want to handle the climber? we can probably copy the intake code and replace Reverse with a half-speed mode. running the motor in PercentVbus at 1.0 and something like 0.3 for the slower speed should be fine, as "1.0 percent" ensures it's drawing as much power as it can take (right?)

riyadth (riyadth@gmail.com)
2017-01-24 12:01:01

Yes, 1.0 (100%) is letting the motor draw as much as it wants, which is definitely what you want for the fast climbing rate. For the slower "spooling" rate, we will have to adjust based on the mechanism design and reliability.
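
A rough sketch of those three states, assuming a CANTalon in PercentVbus mode; the device id and the 0.3 "slow" value are placeholders to be tuned against the real mechanism:

```java
import com.ctre.CANTalon;
import com.ctre.CANTalon.TalonControlMode;

// Sketch of the three climber states: off, slow (spooling the rope), fast (climbing).
public class Climber {
    private static final double CLIMB_FAST = 1.0;   // full power for climbing
    private static final double CLIMB_SLOW = 0.3;   // gentle speed for catching the rope (tune me)

    private final CANTalon m_motor = new CANTalon(12);   // placeholder device id

    public Climber() {
        m_motor.changeControlMode(TalonControlMode.PercentVbus);
    }

    public void off()  { m_motor.set(0.0); }
    public void slow() { m_motor.set(CLIMB_SLOW); }
    public void fast() { m_motor.set(CLIMB_FAST); }
}
```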

chrisrin (chrisrin@microsoft.com)
2017-01-24 12:02:11

@jack Not my call whether or not there will be a climber - that's for the captains to decide. I do think the team will make a run at it, and I've been trying to prompt some crowdsourced analysis and brainstorming over on the climber channel. I shared the thought about states here in case you all think it makes sense to do anything now in anticipation of there being a climber.

chrisrin (chrisrin@microsoft.com)
2017-01-24 12:16:01

One other thought on control: since the states fall in a repeating sequence (off to slow, slow to fast, and fast to off), the interface could be a single button I suppose. Not sure what's best, but I'm assuming less is more given the number of controls that may be on the station.

riyadth (riyadth@gmail.com)
2017-01-24 13:04:36

One big question might be if we can detect when to stop automatically, or if it has to be under driver control. If we can't detect based on a sensor, then we need to make sure the mechanism won't break itself if not used correctly by the driver.

jack (jack@phroa.net)
2017-01-26 13:59:20

regarding https://github.com/Spartronics4915/2017-STEAMworks/pull/17 - we still need to figure out why this doesn't work some of the time. I briefly looked at one log and it said something about exit code 127 being the reason it couldn't get the version string

GitHub
dana_batali (dana.batali@gmail.com)
2017-01-26 14:07:53

Two data points:

  1. the user must have git in their path... On windows this requires one to use the system control panel, etc. (and depends on how git was installed on the system)
  2. there needed to be at least one tag available... I was getting this error when my branch had no tags.

Question for you: did the week2 tag automatically appear, or did you need to explicitly pull it?

jack (jack@phroa.net)
2017-01-26 14:08:16

so, I actually sent that PR from my other computer, so I had to run a fresh clone to get the repo. I think that pulled the tag with it.

dana_batali (dana.batali@gmail.com)
2017-01-26 14:08:54

that's good news... Probably a good idea to add a new tag, say week3, to see what effect that has

jack (jack@phroa.net)
2017-01-26 14:09:14

let's try that tomorrow

jack (jack@phroa.net)
2017-01-26 14:09:42

for what it's worth, week2 is the only tag I have. there's no week1

dana_batali (dana.batali@gmail.com)
2017-01-26 14:46:54

right... week2 is the only tag that's been added i bleev

dana_batali (dana.batali@gmail.com)
2017-01-26 15:24:17

hmm... @jack: i'm seeing different behavior on my windows git install. Specifically, unless i do:

git pull upstream master --tags

I don't see the new tags.

I just tested this by creating a new tag from the POV of github, then I did a git pull upstream master (no --tags),

I used git describe --tags to determine state

dana_batali (dana.batali@gmail.com)
2017-01-26 15:25:48

http://stackoverflow.com/questions/1204190/does-git-fetch-tags-include-git-fetch/1208223#1208223

stackoverflow.com
jack (jack@phroa.net)
2017-01-26 15:26:46

'git clone' does pull the tags, 'git pull' on an existing repo doesn't I believe

jack (jack@phroa.net)
2017-01-26 15:26:52

I had to use clone since this is a different computer

dana_batali (dana.batali@gmail.com)
2017-01-26 15:29:30

more on this from "git help pull": By default, tags that point at objects that are downloaded from the remote repository are fetched and stored locally. This option disables this automatic tag following. The default behavior for a remote may be specified with the remote.<name>.tagOpt setting. See git-config(1).

dana_batali (dana.batali@gmail.com)
2017-01-26 15:31:20

(this was from the --no-tags section)

jack (jack@phroa.net)
2017-01-26 15:32:27

what happens if you 'git pull upstream' without master? perhaps it's not pulling new tags since master is the only ref it was told to pull

dana_batali (dana.batali@gmail.com)
2017-01-26 15:33:06

since I've now polluted my local repo, I'm not sure I can answer this

jack (jack@phroa.net)
2017-01-26 15:33:45

might have to experiment with someone else's tomorrow then

riyadth (riyadth@gmail.com)
2017-01-26 19:50:24

Pulling should always pull tags (without any options). However, pushing does not push tags, unless specifically told to do so (git push --tags)

jack (jack@phroa.net)
2017-01-26 19:50:38

hmm

Clio Batali (cliombatali@gmail.com)
2017-01-27 19:45:03

http://www.vexrobotics.com/217-5049.html @jeremylipschutz @brianhutchison @ronan_bennett

brian_hutchison (savingpvtbrian7@gmail.com)
2017-01-27 20:08:14
riyadth (riyadth@gmail.com)
2017-01-27 22:42:51

Both the magnetic (217-5049) and optical (amt103) encoders are fast enough for our applications. The magnetic encoder can do absolute positioning to 6600 RPM, and quadrature to 15000, so it could be used to read the output of a CIM directly (and easily handle the agitator). The optical encoder can handle either 7500 or 15000 RPM, depending on the model number. Again, fast enough for direct mounting to a CIM.

binnur (binnur.alkazily@gmail.com)
2017-01-29 10:47:50

team, good overview of PID and Talon SRX. @declanfreemangleason @niklaspruen, see slide #20 on best practices for reversing sensor direction (using reverseSensor() instead of negating sign) https://docs.google.com/presentation/d/1_D8RkpKMOcsGaR1Tjba90VsQoI3Abe81k5sg9Yg9H2o/preview?slide=id.g138e6bb2e6_2_195
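
For reference, the slide's recommendation translates to something like this (a sketch; the device id is a placeholder):

```java
import com.ctre.CANTalon;

public class SensorDirectionExample {
    // Prefer reverseSensor() over negating setpoints/readings in our own code,
    // so the Talon's closed loop sees a consistently-signed sensor.
    public static CANTalon makeLeftMaster() {
        CANTalon talon = new CANTalon(3);   // placeholder device id
        talon.reverseSensor(true);          // encoder counts opposite to motor direction
        return talon;
    }
}
```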

dana_batali (dana.batali@gmail.com)
2017-01-30 13:13:24

team: I took a crack at an even-simpler overview of motor controllers here. If you don't understand the stuff in the above presentation, please peruse this: https://docs.google.com/presentation/d/1L8-OFV8CBPUkS134OtNrn8gqD0vTJ9to8vfntubdxN0/edit?usp=sharing

dana_batali (dana.batali@gmail.com)
2017-01-30 13:13:26
dana_batali (dana.batali@gmail.com)
2017-01-30 13:26:20

btw: in the slides Binnur referenced, there are no calls in execute... This implies to me that robot safety has been disabled.

dana_batali (dana.batali@gmail.com)
2017-01-30 13:27:22

If you don't know what that means, please refer to my slides on motor safety.

tom_wiggin (twigginthecool@gmail.com)
2017-02-01 19:35:52

according to clio the entire electronics board is going vertical

tom_wiggin (twigginthecool@gmail.com)
2017-02-01 19:36:01

the entire thing

tom_wiggin (twigginthecool@gmail.com)
2017-02-01 19:36:12

it's insane

😁 tom_wiggin
dana_batali (dana.batali@gmail.com)
2017-02-02 09:31:25

programmers of position closed-loop modes: I found another example of closed loop control settings from the ctre people here: http://www.ctr-electronics.com/downloads/pdf/Talon%20SRX%20Software%20Reference%20Manual.pdf, page 83... The example is not for FRC code, but rather their c#/hero platform. The good news is that it's quite readable and has comments explaining many of the magic configuration settings. One mystery I can't explain: when they EnableClosedLoop, they SetVoltageRampRate(0). This may imply that they wish to proceed with no acceleration ramp. They do configure the PeakOutputVoltage to +3,-3, but since we're not talking FRC robots, that might not mean the same thing.

The example does wait 100ms after each call to SetPosition(0);
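
In FRC Java terms, a position closed-loop setup in the same spirit might look like the sketch below; every gain, voltage limit, and the device id here is a placeholder that needs tuning, and the rotations interpretation of set() assumes codes-per-rev has been configured:

```java
import com.ctre.CANTalon;
import com.ctre.CANTalon.FeedbackDevice;
import com.ctre.CANTalon.TalonControlMode;

// Sketch of a position closed-loop configuration; values are illustrative only.
public class PositionLoopExample {
    private final CANTalon m_motor = new CANTalon(3);   // placeholder device id

    public PositionLoopExample() {
        m_motor.setFeedbackDevice(FeedbackDevice.QuadEncoder);
        m_motor.configEncoderCodesPerRev(250);          // placeholder encoder resolution
        m_motor.configNominalOutputVoltage(0.0, -0.0);
        m_motor.configPeakOutputVoltage(3.0, -3.0);     // gentle limit while tuning
        m_motor.setPID(0.4, 0.0, 0.0);                  // position mode: start with P only
        m_motor.changeControlMode(TalonControlMode.Position);
        m_motor.setPosition(0);                         // zero the sensor before moving
    }

    public void goTo(double rotations) {
        // In Position mode with codes-per-rev configured, set() should be in rotations.
        m_motor.set(rotations);
    }
}
```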

brian_hilst (brian@hilst.org)
2017-02-02 14:50:08

@dana_batali Thanks! Will take a look at it.

brian_hilst (brian@hilst.org)
2017-02-02 15:12:39

@dana_batali I was re-reading your slides and have a couple follow-up questions:

  1. On slide #17 it mentions setting closed loop peak outputs in the range of -1023, 1023. What method is this referring to? The only methods with “Peak” I see are for voltage.
  2. Slide 18 suggests just setting the F gain. Should PID all be set to zero then?
dana_batali (dana.batali@gmail.com)
2017-02-02 15:16:21

2: my reading is that F is primarily useful in velocity control mode. There, one starts with PID=0, F>0... For position mode, I believe one starts with P > 0 and IDF=0.

dana_batali (dana.batali@gmail.com)
2017-02-02 15:18:34

1: these numbers represent the entire range of "throttle units"... I'm not entirely clear on the connection between throttle and voltage. I might guess that configPeakOutputVoltage may be what we're talking about here...

binnur (binnur.alkazily@gmail.com)
2017-02-02 17:02:02

@brian_hilst: I have shown the calculation between throttle units to rates in my sample code - see the constructor comments for some calculations

brian_hilst (brian@hilst.org)
2017-02-02 17:02:59

@binnur Ok. We’re testing your code now. First try it ran the motors continuously in opposite directions. Putting in some logging now.

brian_hilst (brian@hilst.org)
2017-02-02 17:03:07

Moving is better than not!

binnur (binnur.alkazily@gmail.com)
2017-02-02 17:03:13

On PID - start simple is always the recommendation - that translates to starting with the P value and then determining if others are needed (usually by observation, like: is there oscillation occurring that needs correction)

binnur (binnur.alkazily@gmail.com)
2017-02-02 17:04:03

Opposite direction means the reverseSensor and reverse motors are incorrect in code - I have a comment on that line

binnur (binnur.alkazily@gmail.com)
2017-02-02 17:06:35

(Typing on iPhone is hard - sorry for typos...)

dana_batali (dana.batali@gmail.com)
2017-02-03 09:45:54

note the negation of values sent to the right motors

dana_batali (dana.batali@gmail.com)
2017-02-03 11:10:11

Autonomous team: to summarize the plan of attack we all discussed last night.

@binnur mentioned that there were 4 options... I believe option #2 was dismissed leaving these three approaches:

  1. we continue with our auto-drive-straight approach, controlling two separate motors with two separate control modes. @brianhilst @niklaspruen might continue to tune their code and integrate it with the recent drivetrain changes.

  2. we implement a software PID (atop DriveDistancePIDCmd) that tries to ensure straight driving while achieving a repeatable distance (to within an inch). 2a: modify the DriveDistancePIDCmd to sample the IMU and adjust the rotation to stay on the initial IMU heading (I finally dug up last year's reference for this: 2016-Stronghold/src/..../commands/Drivetrain/DriveStraightCommand.java; a sketch of this approach follows below). 2b (described by @riyadth): set one motor running the hardware PID/Position and install a software PID that tries to track the primary motor's encoder position. My current sense is that fixing the DriveDistancePIDCmd is trivial, so we should implement that quickly and validate it first thing Sunday. If we find it lacks sufficient accuracy, then we should proceed to 2b. @declanfreemangleason & @niklaspruen might pursue this.

  3. to implement a driver-recording/playback mode. There we'd need to sample the series of calls to arcadeDrive with the associated timestamp, then see if we can replay it. @jack was going to pursue this plan.

Regarding building toward some actual autonomous programs:

  1. a little amount of work remains in order to deliver a drive-to-line command. We need clarification from rule-masters on whether we need to simply break the line with the bumper or to entirely cross the line with the robot. Additionally, this command probably needs to behave differently according to initial position.

  2. LT has requested that we put shooting as our highest priority. If we can achieve sufficient accuracy with our driveStraight and our autoTurn commands, then we need to string a few of these into a CommandGroup and measure the accuracy of this "dead reckoning" approach. We discussed the need for two implementations - for the red and blue alliance configurations. If performance-capture proves successful, we simply need to record a few performances from a variety of starting locations and field configurations. Obviously we need to include launcher/agitator control here, so we'll need collaboration from @jeremylipschutz, @ronanbennett, @brian_hutchison .

  3. next on the list is the center-position sprocket-delivery. In theory, if we can accurately drive straight, we simply need to measure the precise distance from center-position to the sprocket-delivery location. We need to ensure that the delivery is both accurate and timely: that is we must allow time for the pilot to pull and deposit the sprocket into place. If we can't achieve reliable accuracy with dead-reckoning or performance-capture, we could try to move forward with PixyCam vision.

  4. last on the list is the side position sprocket-delivery. Again, if dead-reckoning or performance-capture is sufficiently accurate, it's only a matter of measurement plus managing the variations. And again, if dead-reckoning or performance-capture isn't sufficient we'd need to investigate augmenting it with vision.

Please chime in with additional comments, observations, corrections etc!

👍 lia_johansen, Clio Batali
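
As referenced in item 2 above, a sketch of the 2a approach (hold the initial IMU heading while driving a set distance). Drivetrain, its accessors (getIMUHeadingDegrees, resetEncoders, getDistanceInches, arcadeDrive, stop), and the kHeadingP gain are placeholders, not our actual API:

```java
import edu.wpi.first.wpilibj.command.Command;

// Drive straight by correcting rotation against the heading captured at initialize().
public class DriveStraightHeadingCommand extends Command {
    private static final double kHeadingP = 0.03;   // proportional gain on heading error (tune me)

    private final Drivetrain m_drivetrain;
    private final double m_inches;
    private double m_targetHeading;

    public DriveStraightHeadingCommand(Drivetrain drivetrain, double inches) {
        m_drivetrain = drivetrain;
        m_inches = inches;
        requires(drivetrain);
    }

    @Override
    protected void initialize() {
        m_targetHeading = m_drivetrain.getIMUHeadingDegrees();  // lock in the starting heading
        m_drivetrain.resetEncoders();
    }

    @Override
    protected void execute() {
        double error = m_targetHeading - m_drivetrain.getIMUHeadingDegrees();
        // a small rotation correction keeps us on the original heading
        m_drivetrain.arcadeDrive(0.5, kHeadingP * error);
    }

    @Override
    protected boolean isFinished() {
        return m_drivetrain.getDistanceInches() >= m_inches;
    }

    @Override
    protected void end() {
        m_drivetrain.stop();
    }

    @Override
    protected void interrupted() {
        end();
    }
}
```
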
brian_hilst (brian@hilst.org)
2017-02-03 18:12:43

One addition that was discussed is using one or two switches on the bumper of the robot to detect contact with the boiler during autonomous. However, it might be sufficient to just program for a little extra distance and watch for the velocity to stop.

dana_batali (dana.batali@gmail.com)
2017-02-04 14:55:29

here are the details about IP address, mDNS and bandwidth limitations during competition...

http://wpilib.screenstepslive.com/s/4485/m/24193/l/291972-fms-whitepaper

brian_hilst (brian@hilst.org)
2017-02-05 10:30:20

Niklas and I will be coming after church later this morning.

binnur (binnur.alkazily@gmail.com)
2017-02-05 11:03:42

PID control rules of thumb -- And, here is a good resource on tuning PIDs https://youtu.be/UOuRx9Ujsog

jack (jack@phroa.net)
2017-02-05 13:33:56
    private static final double turnKp = 0.12;
    private static final double turnKi = 0;
    private static final double turnKd = 0.30;
    private static final double turnKf = 0.001;
jack (jack@phroa.net)
2017-02-05 13:34:18

those should be SmartDashboard values
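
Something like this would let us tweak them from the dashboard without redeploying (a sketch; the key names are arbitrary and the defaults are just the current constants):

```java
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

// Read the turn PID gains from the SmartDashboard, falling back to the current constants.
public class TurnGains {
    public static double kp() { return SmartDashboard.getNumber("Turn kP", 0.12); }
    public static double ki() { return SmartDashboard.getNumber("Turn kI", 0.0); }
    public static double kd() { return SmartDashboard.getNumber("Turn kD", 0.30); }
    public static double kf() { return SmartDashboard.getNumber("Turn kF", 0.001); }
}
```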

niklas_pruen (niklas.pruen@gmail.com)
2017-02-05 17:45:22

Does anybody know the precision of the IMU? What is the smallest amount of degrees that it can measure?

jack (jack@phroa.net)
2017-02-05 17:51:52

1/16 of a degree, if I'm reading https://github.com/Spartronics4915/2017-STEAMworks/blob/master/src/org/usfirst/frc/team4915/steamworks/sensors/BNO055.java#L552 right

GitHub
niklas_pruen (niklas.pruen@gmail.com)
2017-02-05 17:54:34

that's pretty good.. thanks!

brian_hilst (brian@hilst.org)
2017-02-06 09:18:26

Has anyone heard about plans to work on the robot today?


declan_freemangleason (declanfreemangleason@gmail.com)
2017-02-06 10:49:55

I don't think there are any.

lia_johansen (lilixlucky@gmail.com)
2017-02-08 20:21:19
jack (jack@phroa.net)
2017-02-08 20:35:44

So does the diamond plate extend to the boiler or is it a different wall material? :/

lia_johansen (lilixlucky@gmail.com)
2017-02-08 20:36:54

That's what we need to look at in the manual @jack

jack (jack@phroa.net)
2017-02-08 20:37:28

I wish the GDC would say yes or no in response to that question rather than "yeah it's in the rules"

dana_batali (dana.batali@gmail.com)
2017-02-09 09:42:27

this picture shows the diamond plate (back wall) continuing into the key. Raises an interesting possibility: we start in the key, shoot 10 balls, then deliver a sprocket to the side delivery spot.

dana_batali (dana.batali@gmail.com)
2017-02-09 09:47:10

and on the other autonomous question: from table 4.1:

AUTO mobility For each ROBOT that breaks the BASE LINE vertical plane with their BUMPER by T=0

dana_batali (dana.batali@gmail.com)
2017-02-09 09:51:18
jack (jack@phroa.net)
2017-02-09 15:08:52
    public final JoystickButton m_turnIMUStart = new JoystickButton(m_auxStick, 3);
    public final JoystickButton m_driveDistance = new JoystickButton(m_auxStick, 4);
    public final JoystickButton m_driveDistancePID = new JoystickButton(m_auxStick, 5);

    public final JoystickButton m_replayRecord = new JoystickButton(m_auxStick, 6);
    public final JoystickButton m_replayStop = new JoystickButton(m_auxStick, 7);
    public final JoystickButton m_replayReplay = new JoystickButton(m_auxStick, 9);

    public final JoystickButton m_intakeOn = new JoystickButton(m_driveStick, 7);
    public final JoystickButton m_intakeOff = new JoystickButton(m_driveStick, 9);
    public final JoystickButton m_intakeReverse = new JoystickButton(m_driveStick, 11);

    public final JoystickButton m_launcherOn = new JoystickButton(m_driveStick, 8);
    public final JoystickButton m_launcherOff = new JoystickButton(m_driveStick, 10);

    public final JoystickButton m_climberOn = new JoystickButton(m_driveStick, 8);
    public final JoystickButton m_climberOff = new JoystickButton(m_driveStick, 12);
    public final JoystickButton m_climberSlow = new JoystickButton(m_driveStick, 10);
jack (jack@phroa.net)
2017-02-09 15:09:12

is our current button layout. if anything you've written but not yet pushed conflicts, change your buttons!

jack (jack@phroa.net)
2017-02-09 15:09:48

additionally, the number of buttons on the two sticks is different! the left (aux) stick only has ten, I think.

jack (jack@phroa.net)
2017-02-09 15:11:10

a whole lot of button shuffling will happen later (as we might not even have two joysticks) but right now since we're about to merge everything together we need to make sure only one thing happens with each button.

jack (jack@phroa.net)
2017-02-09 15:11:27

notably: climber on and launcher on are both drive stick #8 !

jack (jack@phroa.net)
2017-02-09 15:11:43

same with climber slow and launcher off

dana_batali (dana.batali@gmail.com)
2017-02-09 19:59:54

*Thread Reply:* jack: i pointed this out to the launcher team and I understood that they have local (uncommitted) mods. Also, I made a similar comment on noah's pull request (which is still pending)

dana_batali (dana.batali@gmail.com)
2017-02-09 20:02:08

*Thread Reply:* It's probably a good idea for the launcher team to submit a pull request so we can achieve non-conflict, sooner rather than later.

jack (jack@phroa.net)
2017-02-10 21:26:54

*Thread Reply:* I'm not a huge fan of how threads aren't expanded by default, I'm not sure that @ronanbennett @brianhutchison @jeremy_lipschutz saw this. hopefully you all see it now :)

brian_hutchison (savingpvtbrian7@gmail.com)
2017-02-10 21:31:05

*Thread Reply:* I'm working on it right now

chrisrin (chrisrin@microsoft.com)
2017-02-09 15:55:33

That last comment makes me think a modular setup like this could be worth exploring in the future (pardon the domain 😀). http://www.beersmith.com/mame/

beersmith.com
dana_batali (dana.batali@gmail.com)
2017-02-09 20:05:31

*Thread Reply:* chrisrin: i don't think it's necessary/reasonable to expect drivers to actuate more than 3-5 controls during a 3min competition. The issue we're currently having is that during software debugging, we need more ways to test a variety of software options, and this is the source of our current conflict. That said, even with a need for only 3-5 controls, it might be cool to have them on a bling-y control panel rather than hiding on a gigantic (and confusing) joystick.

chrisrin (chrisrin@microsoft.com)
2017-02-10 10:28:32

*Thread Reply:* Thanks - I was just reacting to the idea of using a single control for more than one function. UI-wise, the risk of errors & confusion, especially in the heat of competition, seems high. A modular kit that can be adapted to various needs to create a very clear UI across all functions seems like an interesting option.

jack (jack@phroa.net)
2017-02-10 10:48:35

where I think that modular control would be really great is spreading out the controls as a board of buttons (in some kind of order) instead of putting them wherever they fit on a control joystick (especially as we don't even have a movable piece requiring actual joystick movement this year, so it just seems inconvenient)

chrisrin (chrisrin@microsoft.com)
2017-02-10 11:49:42

*Thread Reply:* jack: I could see that being a fun project for a couple people in the off- or pre-season. As far as projects go, I don't think it would be all that difficult to create a set of modules that could plug/play into a driver station using a kit like this... https://gameroomsolutions.com/shop/2-player-led-arcade-control-panel-bundle-kit/

Game Room Solutions
binnur (binnur.alkazily@gmail.com)
2017-02-10 18:17:05

*Thread Reply:* I have been envying those teams that have custom driver stations! I am all for this!! Lets bring Spartronics colors and maybe even logo 😉

jack (jack@phroa.net)
2017-02-10 10:48:51

preseason project for electronics + programming next year?

👌 Clio Batali, binnur
dana_batali (dana.batali@gmail.com)
2017-02-10 10:51:10

@jack: will you be at today's (weekday) meeting?

jack (jack@phroa.net)
2017-02-10 10:51:16

yes

dana_batali (dana.batali@gmail.com)
2017-02-10 10:51:21

excellent

binnur (binnur.alkazily@gmail.com)
2017-02-10 18:35:09

@declan_freemangleason here is the CAD for game field in Solidworks https://www.solidworks.com/sw/education/robot-student-design-contest.htm

solidworks.com
binnur (binnur.alkazily@gmail.com)
2017-02-10 18:35:49

here is some other versions (not sure what we use… 😕 ) https://www.chiefdelphi.com/forums/showthread.php?threadid=153108

chiefdelphi.com
dana_batali (dana.batali@gmail.com)
2017-02-11 15:26:21
dana_batali (dana.batali@gmail.com)
2017-02-11 15:29:23
dana_batali (dana.batali@gmail.com)
2017-02-12 12:18:56
dana_batali (dana.batali@gmail.com)
2017-02-12 12:25:57
dana_batali (dana.batali@gmail.com)
2017-02-12 12:29:51
lia_johansen (lilixlucky@gmail.com)
2017-02-12 14:20:23
binnur (binnur.alkazily@gmail.com)
2017-02-12 14:39:47

@channel for autonomous (or any accurate shooting), robot needs to be aligned in front of the boiler. Note: boiler is 42” wide, and robot w/ bumpers is 40”.

binnur (binnur.alkazily@gmail.com)
2017-02-12 15:39:31

@declanfreemangleason @jack @niklaspruen (and @danabatali @brianhilst) here is what I am thinking about optimizing our command groups for autonomous strategies. Note: I am not attached to any naming, just looking at how we can optimize our command groups for better maintenance.

/**
 ** Categories of command groups for autonomous
 ** goal: minimize the maintenance required as we optimize our autonomous moves
**/

/**
 ** DriveShootCross: from a given location, move to the boiler, shoot,
 **    and optionally cross the baseline
 ** Arguments:
 **      - pass in the starting location, i.e. a set landmark for robot positions
 **      - specify IF using playback motion to move (TRUE | FALSE)
 **      - specify IF crossing the baseline (TRUE | FALSE)
 **
 ** Note: use switch statement to specify move distance and turn values
 **      - note: each of these could be calls to the other command groups or commands
 **
 ** General flow -- may need to add 'waits' in between each activity:
 **    - move+turn and align w/ boiler
 **    - shoot 10 balls
 **    - move+turn to cross the baseline
 **    - stop
 **/

/**
 ** DriveCrossbaseline: from either left or right starting position, move
 **   forward a set distance to crossbaseline and stop
 **
 ** Note: existing drive forward w/ Position mode should be great for this!
 **/

/**
 ** DriveGearDrop: from middle location, drive backwards set distance, and
 **   cross fingers that it will magically align itself w/ lift
 **
 ** Note: experiment w/
 **   - Position mode for driving at low speed
 **   - PID controlled PercentVbus for driving at low speed
 **/

 /**
  ** BoilerDriveToGearDrop: after shooting, move to lift to drop off the gear
  **
  ** Note: this is only IF we actually have time left :)
  **   - arguments need to indicate **which** lift location to move to. We could
  **  assume this is always the nearest to the boiler, but it would need to be
  **  negotiated w/ alliance partners.
  **/
binnur (binnur.alkazily@gmail.com)
2017-02-12 15:47:35

Please optimize the commands to match our desired strategy priorities
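
To make the shape concrete, a skeleton of the first of these; Landmark, the member commands (PlaybackRecording, AlignWithBoiler, ShootBalls, CrossBaseline), and all distances/angles are placeholders, not real classes yet:

```java
import edu.wpi.first.wpilibj.command.CommandGroup;

// Skeleton of DriveShootCross along the lines of the comment block above.
public class DriveShootCross extends CommandGroup {

    public enum Landmark { LEFT, CENTER, RIGHT }

    public DriveShootCross(Landmark start, boolean usePlayback, boolean crossBaseline) {
        if (usePlayback) {
            addSequential(new PlaybackRecording("boiler_" + start));   // recorded approach
        } else {
            switch (start) {                        // per-landmark move + turn to the boiler
                case LEFT:   addSequential(new AlignWithBoiler(60.0, -45.0));  break;
                case CENTER: addSequential(new AlignWithBoiler(84.0, -90.0));  break;
                case RIGHT:  addSequential(new AlignWithBoiler(96.0, -120.0)); break;
            }
        }
        addSequential(new ShootBalls(10));          // shoot 10 balls
        if (crossBaseline) {
            addSequential(new CrossBaseline());     // back out and break the baseline plane
        }
    }
}
```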

brian_hilst (brian@hilst.org)
2017-02-12 22:41:44

Looking at the shooter position, can someone clarify where it is currently intended to shoot from? @binnur last stated it needed to be aligned in front of the boiler. Which side of the robot needs to be positioned there?

dana_batali (dana.batali@gmail.com)
2017-02-13 09:31:36

@brian_hilst - can you provide an update on when yours and niklas' code will be submitted? Did you request Declan to review it?

brian_hilst (brian@hilst.org)
2017-02-13 09:32:25

I will submit a pull request this morning.

2017-02-13 09:34:18

@danabatali commented on @liajohansen’s file Image uploaded from iOS: regarding #5, please NOTE: if you drop off the gear in the central station, you will have crossed the line - we only need to break the plane, not entirely cross with our robot.

dana_batali (dana.batali@gmail.com)
2017-02-13 10:19:38

@riyadth : regarding bandwidth: we are allowed 7Mbits/sec, control packets consume 100kb/sec, leaving 6.9Mb/sec for camera. (Of course our own smart-dashboard traffic isn't included, so really we have less).

dana_batali (dana.batali@gmail.com)
2017-02-13 10:20:04
dana_batali (dana.batali@gmail.com)
2017-02-13 10:21:36
dana_batali (dana.batali@gmail.com)
2017-02-13 10:25:11

@riyadth - here's a reddit thread that may pour some cold water on the notion of two usb cameras: https://www.reddit.com/r/FRC/comments/2syxn7/multiple_usb_cameras/

reddit
dana_batali (dana.batali@gmail.com)
2017-02-13 10:26:31

*Thread Reply:* https://forums.ni.com/t5/FIRST-Robotics-Competition/Can-2-USB-Cameras-be-setup-on-this-year-s-robot/td-p/3468551

forums.ni.com
riyadth (riyadth@gmail.com)
2017-02-13 21:56:15

*Thread Reply:* I'd hardly say cold water... More like a refreshing mist :-) I found several solutions on-line that look good. I am particularly fond of this system: https://github.com/RingOfFireOrg/FRC_2017_Competition/blob/2d3162537542a9c7b3258601be33c2d5122d84a3/src/org/usfirst/frc/team3459/robot/Cameras.java

riyadth (riyadth@gmail.com)
2017-02-13 21:56:52

*Thread Reply:* It should also let us address the TCP port issue, which was a great observation in your earlier post (Thanks!)

riyadth (riyadth@gmail.com)
2017-02-13 21:57:37

*Thread Reply:* I'll give the two camera solution a spin next time I have access to the robot. Maybe I'll be able to find a student to take the concept and run with it.

dana_batali (dana.batali@gmail.com)
2017-02-14 08:59:34

*Thread Reply:* the idea of switching between two camera feeds has a few pluses and minuses, but I think the pros probably outweigh the cons:

Pro Switched Camera Feeds:
  • better use of bandwidth
  • that would allow us to increase the frame rate or image size
  • less distracting than two images on the screen

Con Switched Camera Feeds:
  • requires user to press the switch, more programming time
  • less available information to the drivers at any instant

Regarding manpower: since we do have a two IP camera solution working right now, we just need to make sure this change isn’t too distracting (or is appropriately distracting :-))

Regarding pros and cons of IP vs USB cameras:

Pro USB:
  • fewer wires (no IP hub/switch, no camera power cords)
  • simpler (no additional IP addresses, configuration wizards, etc)
  • support for a wider, cheaper range of cameras (we have 3 already)

Pro IP:
  • currently works, no roborio software required
  • more graceful failure cases since it doesn’t depend upon roborio
  • more tested than usb cameras (both for us and globally)

dana_batali (dana.batali@gmail.com)
2017-02-14 09:02:19

*Thread Reply:* So I guess before we press to the next step we want to get buy-in from the student leadership team?

dana_batali (dana.batali@gmail.com)
2017-02-14 09:03:22

*Thread Reply:* @coachchee @Clio Batali - any thoughts on this thread?

coachchee (echee@bisd303.org)
2017-02-14 09:43:22

*Thread Reply:* I will have Clio discuss with leadership on Wed at 3 pm. Switch camera feed or not? USB vs IP? I will defer to you and Riyadth since I have not done any research. I suggest you tell Clio what you and Riyadth recommend and let student leadership make the final decision. Thanks

john_sachs (johncsachs@gmail.com)
2017-02-13 21:29:30

@brian_hilst: the current plan is to shoot up and over the back (gear holder end) of the robot, so you would back the robot up so that it's centered and flush with the boiler. That said, the shooter can be adjusted to launch over the front (intake end) of the robot. A couple possible benefits of shooting over the front: the intake would be flush against the boiler rather than exposed to taking hits from other robots playing defense against us, the balls might have a slightly better trajectory, and you may be able to get away with only needing one camera (instead of front and back). Downside is you may lose some ball storage if you shoot over the front (have to keep a lane clear). Needless to say, lots of testing and decisions to be made over the next few days.

dana_batali (dana.batali@gmail.com)
2017-02-13 21:33:55

@johnsachs and @brianhilst there is an ongoing thread in launcher that is color coded to help better communicate

chrisrin (chrisrin@microsoft.com)
2017-02-13 23:23:31

free(ish) stuff: I believe the coach recently acquired some 775pro motors for the team, and I noticed this CD thread that points out armabot.com has provided a free voucher in the kit of parts book for a $25 "Bad Boy" encoder that works with that motor - just need to pay $5 to $7 shipping according to the thread - might be worth picking up since we have the motor - here's the thread: https://www.chiefdelphi.com/forums/showthread.php?threadid=153323

chiefdelphi.com
coachchee (echee@bisd303.org)
2017-02-13 23:34:45

I ordered it already. Clio has it in the electronics section . It came in 2 weeks ago.

👍 Clio Batali
lia_johansen (lilixlucky@gmail.com)
2017-02-15 16:10:39

Since we will not have the real robot until Friday, that puts us back in testing so programming will be meeting Saturday 2/18 from 1 pm - 6pm and meeting early on Sunday 2/19 from 9-1 pm and then staying for the real meeting 1pm-5 pm

binnur (binnur.alkazily@gmail.com)
2017-02-15 16:10:59

@channel ^^^

timo_lahtinen (timolahtinen1@gmail.com)
2017-02-15 17:06:29
dana_batali (dana.batali@gmail.com)
2017-02-15 17:06:38
2017-02-16 08:05:31

@danabatali commented on @danabatali’s file Pasted image at 2017-02-15, 5:06 PM: this shows that they've extended the legal ports to include multiple roborio http server ports (1180-1190)

dana_batali (dana.batali@gmail.com)
2017-02-16 08:49:43

@jack, @binnur @declan_freemangleason

dana_batali (dana.batali@gmail.com)
2017-02-16 08:57:01

regarding auto selection:

  • Jack's pull request has been merged... I made a few suggestions regarding display ordering, but that's a cosmetic issue, not a functional one.

  • Jack makes the distinction between "Presets" and "Recordings". Presumably Declan will begin to populate the presets options

  • I understood from declan that his commandgroups may be parameterized by initial field location. Since this information is provided by the driver, we'll need to convey this to interested CommandGroups/SubCommands via a "pull" from the SmartDashboard. I suggest we adopt the naming convention "AutoPosition" and "AutoPositionOptions" for this SmartDashboard field (a sketch follows below this list). I will make the changes to the SmartDashboard to support this. It will be up to Declan to populate AutoPositionOptions and to query "AutoPosition" in his commands. Declan should be able to follow the pattern for AutoStrategy and AutoStrategyOptions.

  • I'm not sure how best to specify initial field position. There are more than 3 potential positions that touch the diamond plate and there are also at least two cases where the orientation of the robot matters. (nose vs butt to wall, oriented at an angle so we don't need to move in order to shoot)

  • We do know which alliance we're on (through the HAL interface), so we won't need the driver to provide us this information
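
As mentioned in the third bullet, a sketch of the proposed convention; I'm assuming the options are published as a comma-separated string and the dashboard writes the chosen one back, but the exact encoding is up to whoever wires the dashboard side:

```java
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

// Sketch of the AutoPosition / AutoPositionOptions convention.
public class AutoPositionSelection {

    // Robot side: publish the legal choices and a default selection.
    public static void publishOptions() {
        SmartDashboard.putString("AutoPositionOptions", "Left,Center,Right");
        SmartDashboard.putString("AutoPosition", "Center");   // default until the driver picks
    }

    // Command side: pull whatever the driver selected on the dashboard.
    public static String getSelectedPosition() {
        return SmartDashboard.getString("AutoPosition", "Center");
    }
}
```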

jack (jack@phroa.net)
2017-02-16 09:54:31

We can tack a .sort() on https://github.com/Spartronics4915/2017-Dashboard/blob/master/www/js/pg_driver.js#L43 to address your cosmetic issue

github.com
jack (jack@phroa.net)
2017-02-16 09:56:16

regarding parameterized commandgroups, I would suggest having three presets in the list for each command: Command - 1, Command - 2, Command - 3 or something to represent starting position. it would make for a rather large dropdown menu but I think the decrease in complexity would be worth it. those names would map to commandgroups preinitialized with the starting position as a constructor parameter

jack (jack@phroa.net)
2017-02-16 09:57:50

that might get slightly too ugly when we account for potential angles within the starting locations, not sure

dana_batali (dana.batali@gmail.com)
2017-02-16 10:47:00

hm - seems to me that expands exponentially with increasing numbers of starting positions. Perhaps a concrete example of the two scenarios is in order.

and regarding sorting, that sorta works, but never as well as explicit ordering in my experience.

dana_batali (dana.batali@gmail.com)
2017-02-16 10:54:44

Example with a single strategy menu:

Preset Fixed Shoot (Happy Place)
Preset Center Sprocket
Preset Non-Center Line Cross
Preset Fixed Shoot (Happy Place) + Line Cross
Preset Moving Shoot From Position 1
Preset Moving Shoot From Happy Place plus Sprocket Drop

Recorded Fixed Shoot (Happy Place)
Recorded Center Sprocket
Recorded Non-Center Line Cross
Recorded Fixed Shoot (Happy Place) + Line Cross
Recorded Moving Shoot From Position 1
Recorded Moving Shoot From Happy Place plus Sprocket Drop

dana_batali (dana.batali@gmail.com)
2017-02-16 10:55:12

(which, I guess works, but isn't alphabetized)

dana_batali (dana.batali@gmail.com)
2017-02-16 10:56:19

(but I haven't included a position 3 variant, nor have I presented orientation)

dana_batali (dana.batali@gmail.com)
2017-02-16 10:57:13

Conclusion: with this set, I'm now convinced that we don't need/want a separate position menu, since it would imply that position and strategy are orthogonal/independent

dana_batali (dana.batali@gmail.com)
2017-02-16 10:58:19

in other words, I believe that certain strategies only work from certain positions

riyadth (riyadth@gmail.com)
2017-02-17 22:26:38

CTRE has updated their Toolsuite software (for Talon SRX). Update your installations to avoid unhappiness! http://www.ctr-electronics.com/control-system/hro.html#product_tabs_technical_resources

binnur (binnur.alkazily@gmail.com)
2017-02-18 09:44:34

@channel ^^^

binnur (binnur.alkazily@gmail.com)
2017-02-18 12:42:56

@dana_batali which process do I need to kill?? looks like my ctrl-c didn’t kill the processes and close the sockets cleanly:

(master) binnur@Chiana(2017-Dashboard)> python DashboardServer.py
12:40:05 INFO : dashboard: Connecting to networktables at 10.49.15.2
12:40:05 INFO : nt : NetworkTables 2017.0.5 initialized in client mode
12:40:05 INFO : dashboard: Networktables Initialized
Traceback (most recent call last):
  File "DashboardServer.py", line 69, in <module>
    robotlog = Robotlog.Robotlog()
  File "/Users/binnur/Development/Spartronics/2017-Dashboard/pylib/Robotlog.py", line 79, in __init__
    self.udpsock.bind((addr, port))
  File "/usr/local/Cellar/python/2.7.13/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py", line 228, in meth
    return getattr(self._sock,name)(*args)
socket.error: [Errno 48] Address already in use

tom_wiggin (twigginthecool@gmail.com)
2017-02-18 17:24:59

I will no longer attend meetings until next year because I feel I haven't been very helpful, and entering a competition, the last thing we need is unhelpful people 🙂 I look forward to next year and by then I will hopefully have actually learned some programming!

🙍 chris_mentzer
tom_wiggin (twigginthecool@gmail.com)
2017-02-18 18:16:56

:hurtrealbad:

lia_johansen (lilixlucky@gmail.com)
2017-02-18 18:49:07

Hey everyone, so a reminder that tomorrow we will be meeting at 1 pm until late. Please bring food for dinner or money to buy dinner. See you then!

binnur (binnur.alkazily@gmail.com)
2017-02-19 11:01:50

@brianhilst @declanfreemangleason the DriveCommandGroup seems pretty straightforward to use. Two things to add:

  1. Stop command
  2. Print logs so we can see what we asked the robot to do

Note: in code I don’t see anything that logs ‘there is the autonomous command driver has chosen’ and ‘here is the steps we executed’. This is REALLY important as it is very common to hear from drivers ‘hey, I told robot to do xyz, and it didn’t. The only proof we have is what we log, and if that is nothing, then we have to take the word of the driver and trust me, that is hard to debug

binnur (binnur.alkazily@gmail.com)
2017-02-19 11:02:47

^^^ we need to make sure this information is printed regardless of whether debug filters are turned down.
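
A minimal way to guarantee that record exists independent of logger levels (a sketch; the class, keys, and message format are arbitrary):

```java
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

// Always record which autonomous the driver selected and what we actually ran.
// println goes to the driver station console log regardless of our logger filters.
public class AutoAudit {
    public static void logSelected(String commandName) {
        System.out.println("AUTO SELECTED: " + commandName);
        SmartDashboard.putString("Auto Selected", commandName);
    }

    public static void logStep(String step) {
        System.out.println("AUTO STEP: " + step);
    }
}
```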

binnur (binnur.alkazily@gmail.com)
2017-02-19 11:08:57

^^^ thinking… we can also omit the stop command, but at the end of the command group, just keep sending ‘stop’ to the drivetrain until autonomous ends.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-02-19 11:52:55

Yeah, that makes sense. I'll be sure to add it.

brian_hilst (brian@hilst.org)
2017-02-19 12:04:59

@binnur Thanks!

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:12:23

@brianhilst @declanfreemangleason — for driving straight, what are we using — 1) position mode w/ IMU, 2) percentVbus w/ IMU, 3) undecided and need to make a decision? (I am a confused mentor…)

riyadth (riyadth@gmail.com)
2017-02-19 12:23:42

@jack I am concerned if path recording is inadvertently started by the drivers during competition, there is a chance that we would use a lot of memory by the end of the 2.5 minute match. Will the drivers be aware, and able to turn recording off? Or is there some other safety feature we can put in place, such as only allow recording to be enabled when not in an actual match (using information from the field management system)?

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:23:45

todo - in @declan_freemangleason branch, lets code review public void setControlMode(TalonControlMode m, double fwdPeakV, double revPeakV, double P, double I, double D, double F)

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:24:03

^^^ configPeakOutputVoltage is applicable to closed loop modes, right?
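
For reference during the review, a rough sketch of what a wrapper with that signature typically does with the 2017 CTRE CANTalon API; this is illustrative only (hypothetical CAN IDs, not the code in the branch under review):
```java
import com.ctre.CANTalon;
import com.ctre.CANTalon.TalonControlMode;

public class DrivetrainControlModeSketch
{
    private final CANTalon m_leftMaster = new CANTalon(1);    // hypothetical CAN IDs
    private final CANTalon m_rightMaster = new CANTalon(2);

    public void setControlMode(TalonControlMode m, double fwdPeakV, double revPeakV,
                               double p, double i, double d, double f)
    {
        for (CANTalon talon : new CANTalon[] { m_leftMaster, m_rightMaster })
        {
            talon.changeControlMode(m);
            // per the question above: peak output voltage is meant for the closed-loop
            // modes (Speed/Position); in PercentVbus it is not expected to do anything
            talon.configPeakOutputVoltage(fwdPeakV, revPeakV);
            talon.setPID(p, i, d);
            talon.setF(f);
        }
    }
}
```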

jack (jack@phroa.net)
2017-02-19 12:24:54

riyadth: good point. the recording controls are already located on the alt drive stick (which we shouldn't be touching at all) but I can add something like a 15 second limit or unchecked-by-default checkbox on the dashboard

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:26:48

explicit control on the dashboard (check/uncheck recording mode) is more fool proof

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:26:56

and should be off by default

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:27:07

at least for me 🙂

riyadth (riyadth@gmail.com)
2017-02-19 12:31:44

Question for the team. I see that in SpartronicsSubsytem m_initialized defaults to 'true'. This seems odd, as it is set to true before the subsystem is actually initialized. Should this be 'false' by default, and each subsystem then needs to set it to true when initialization is finished? Otherwise, subsystems will appear to be initialized while they are initializing, which could cause improper execution of methods.
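
A sketch of the default-to-false pattern being suggested here (simplified; logInitialized() is a hypothetical helper name, not necessarily what the real class calls it):
```java
import edu.wpi.first.wpilibj.command.Subsystem;

public abstract class SpartronicsSubsystemSketch extends Subsystem
{
    // start out false so callers cannot act on a half-constructed subsystem
    protected boolean m_initialized = false;

    protected void logInitialized(boolean success)
    {
        // each subsystem calls this at the END of its constructor/initialization
        m_initialized = success;
    }

    public boolean isInitialized()
    {
        return m_initialized;
    }
}
```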

binnur (binnur.alkazily@gmail.com)
2017-02-19 12:51:55

We should also talk about timeouts that we can add to commands - it can be useful for turning and stopping us from forever dancing in the field
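
WPILib's Command class already has timeout support (setTimeout()/isTimedOut()), so a turn command with a give-up time might look roughly like this; the class name, the 3-second value, and the onTarget() check are placeholders:
```java
import edu.wpi.first.wpilibj.command.Command;

public class TurnWithTimeoutSketch extends Command
{
    private final double m_degrees;

    public TurnWithTimeoutSketch(double degrees)
    {
        m_degrees = degrees;
        setTimeout(3.0);    // give up after 3 seconds so we never "dance" forever
    }

    @Override
    protected void execute()
    {
        // ... drive the turn toward m_degrees via the IMU-based controller ...
    }

    @Override
    protected boolean isFinished()
    {
        return onTarget() || isTimedOut();   // done when the turn completes OR the timeout expires
    }

    private boolean onTarget()
    {
        return false;   // placeholder for the real heading check
    }
}
```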

binnur (binnur.alkazily@gmail.com)
2017-02-19 16:54:32

@riyadth ```
04:13:12 NOTICE OI: ================================================= #
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0xac611458, pid=6192, tid=2885981280
#
# JRE version: Java(TM) SE Embedded Runtime Environment (8.0_06-b23) (build 1.8.0_06-b23)
# Java VM: Java HotSpot(TM) Embedded Client VM (25.6-b23 mixed mode linux-arm )
# Problematic frame:
# C  [libcscore.so+0x21458]  cs::MjpegServerImpl::ConnThread::SendStream(wpi::raw_socket_ostream&)+0x3b0
#
# Core dump written. Default location: //core or core.6192 (max size 2048 kB). To ensure a full core dump, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# //hs_err_pid6192.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.sun.com/bugreport/crash.jsp
#
➔ Launching «'/usr/local/frc/JRE/bin/java' '-Djava.library.path=/usr/local/frc/lib/' '-jar' '/home/lvuser/FRCUserProgram.jar'»
```

dana_batali (dana.batali@gmail.com)
2017-02-19 17:59:26

*Thread Reply:* @riyadth get's the prize! First one to crash the java runtime!

lia_johansen (lilixlucky@gmail.com)
2017-02-19 17:28:21

@channel Change of schedule: We are meeting tomorrow (Monday) from 1-7 to allow mechanics to work on the robot.

Terry (terry@t-shields.com)
2017-02-19 19:55:47

@Terry has joined the channel

binnur (binnur.alkazily@gmail.com)
2017-02-19 20:48:37

Auto driving and shooting w/ a stuck launcher

binnur (binnur.alkazily@gmail.com)
2017-02-19 20:50:34

Dancing robot - shoots and works towards the cross line but wants to dance instead

binnur (binnur.alkazily@gmail.com)
2017-02-19 20:52:14

Auto playback moves w/ shooting

binnur (binnur.alkazily@gmail.com)
2017-02-19 21:58:13

@channel here are some notes/observations from today’s activities. Anything else I am missing? Any corrections or questions?

  • Need to tune PID for IMU turning —> applies to when backing up from the boiler to cross the baseline
  • Driving backwards is not incorporating the IMU heading correctly (or applying the correction to the wrong motor?)
  • Why are we seeing ‘Output not updated enough’? stop() writes both set(0) to the motor and arcadeDrive(0,0) (see the sketch below)
  • Autonomous Launcher: should we add a timeout to the launch command to avoid situations where the launcher doesn’t end for whatever reason, so the autonomous strategy continues to run?
  • Why do we need to keep resetting the robot code?
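
On the ‘Output not updated enough’ bullet: assuming the warning is WPILib's motor-safety watchdog on RobotDrive (it fires when the drive isn't commanded at least every ~0.1 s), here is a sketch of the two usual remedies, with hypothetical wiring:
```java
import edu.wpi.first.wpilibj.RobotDrive;

public class MotorSafetySketch
{
    private final RobotDrive m_robotDrive = new RobotDrive(0, 1);   // hypothetical motor channels

    public void periodicExecute(double forward, double rotate)
    {
        // Option 1: feed the watchdog by commanding the drive every scheduler pass
        // (even a stop() that calls arcadeDrive(0, 0) each execute() keeps it fed)
        m_robotDrive.arcadeDrive(forward, rotate);
    }

    public void relaxSafety()
    {
        // Option 2: loosen or disable the watchdog (use sparingly; it exists for safety)
        m_robotDrive.setExpiration(0.5);
        // m_robotDrive.setSafetyEnabled(false);
    }
}
```
Continuously calling stop() at the end of the command group, as discussed earlier, has the side effect of keeping this watchdog fed.
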
rose_bandrowski (rose.bandrowski@gmail.com)
2017-02-19 21:58:26

@rose_bandrowski has left the channel

Clio Batali (cliombatali@gmail.com)
2017-02-19 22:46:23

@channel: Master code is still faulty! We're having lots of issues of commands being initiated to OFF - this is the same issue the launcher group experienced today, but now is affecting the launcher, climber, and intake. That said - most things are working, despite launcher jams

Clio Batali (cliombatali@gmail.com)
2017-02-19 23:02:49

Also: please set intake to 0.9 NOT 0.75 (where it is now)

coachchee (echee@bisd303.org)
2017-02-19 23:34:48

Lia , can you please have someone in programming type out what every button on the controller does . A legend sheet . Thanks

jack (jack@phroa.net)
2017-02-19 23:39:50

I thought Clio had one? Here's the code (until a proper list can be made) if it helps. https://github.com/Spartronics4915/2017-STEAMworks/blob/master/src/org/usfirst/frc/team4915/steamworks/OI.java#L54-L90

coachchee (echee@bisd303.org)
2017-02-19 23:42:09

lets attach to driver station, thanks

lia_johansen (lilixlucky@gmail.com)
2017-02-19 23:46:07

@coachchee yes. I have them written in my notebook. Will make a cleaner copy tomorrow

coachchee (echee@bisd303.org)
2017-02-19 23:48:31

Thanks

sean_hooyer (seanhooyer@gmail.com)
2017-02-20 11:14:04

@sean_hooyer has joined the channel

sean_hooyer (seanhooyer@gmail.com)
2017-02-20 11:17:27

The top part of the boiler is marked correctly but constructed incorrectly. We can fix it, but we are waiting for permission of the programming team. (2 inches off)

binnur (binnur.alkazily@gmail.com)
2017-02-20 11:25:41

@binnur pinned their Image Screen Shot 2015-02-04 at 4.45.04 PM.png to this channel.

binnur (binnur.alkazily@gmail.com)
2017-02-20 11:36:20

@sean_hooyer we need an accurate boiler for testing and tuning. programmers will be in at 1pm today. Please fix ASAP. Thank you!

:clio: lia_johansen
binnur (binnur.alkazily@gmail.com)
2017-02-20 11:36:54

btw - what part is off? height or diameter or ??

Clio Batali (cliombatali@gmail.com)
2017-02-20 11:38:05

The top portion is just aligned incorrectly with the base - we will discuss when programmers arrive (should be a quick fix)

binnur (binnur.alkazily@gmail.com)
2017-02-20 11:38:41

works! we can work on the turning accuracy to start with

binnur (binnur.alkazily@gmail.com)
2017-02-20 11:40:57

@declan_freemangleason please add a test button so we can test turn right as well as turn left today. Pls program values so that we can simulate and adjust PID values

  • turns to boiler from both red + blue alliance
  • turns to cross the baseline from both red + blue alliance
binnur (binnur.alkazily@gmail.com)
2017-02-20 11:41:44

^^^ trying to avoid extra steps required to reposition and run autonomous code
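
One lightweight way to get those test buttons, sketched with a stub command: on the stock SmartDashboard client, a Command published with putData shows up as a button that starts it. TurnStub and the angles below are placeholders for the real IMU turn command and values:
```java
import edu.wpi.first.wpilibj.command.Command;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class AutoTestButtons
{
    // stand-in for the real turn command, just so this sketch is self-contained
    public static class TurnStub extends Command
    {
        private final double m_degrees;
        public TurnStub(double degrees) { m_degrees = degrees; }
        @Override protected void initialize() { System.out.println("turning " + m_degrees); }
        @Override protected boolean isFinished() { return true; }
    }

    public static void init()
    {
        // each canned turn can be exercised without re-running a full autonomous group
        SmartDashboard.putData("Test: turn left 45",  new TurnStub(-45.0));
        SmartDashboard.putData("Test: turn right 45", new TurnStub(45.0));
    }
}
```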

lia_johansen (lilixlucky@gmail.com)
2017-02-20 11:46:02

@binnur we also need to make sure all of the code on master is working. Right now, it is faulty and unreliable

binnur (binnur.alkazily@gmail.com)
2017-02-20 11:54:34

K - please make a list of known issues in prioritized order and assign to teams

lia_johansen (lilixlucky@gmail.com)
2017-02-20 11:55:10

I don't know the faults yet. Clio will inform me when i arrive

👍 binnur, Clio Batali
binnur (binnur.alkazily@gmail.com)
2017-02-20 11:55:12

a checklist of test cases prior to bagging is a good thing to do 🙂

lia_johansen (lilixlucky@gmail.com)
2017-02-20 11:55:19

Launcher fix is first

lia_johansen (lilixlucky@gmail.com)
2017-02-20 11:55:32

When they do that I'll figure out the rest of the priority list

lia_johansen (lilixlucky@gmail.com)
2017-02-20 11:55:38

👍:skin-tone-3:👍:skin-tone-3:👍:skin-tone-3:

👍 binnur, Clio Batali
sean_hooyer (seanhooyer@gmail.com)
2017-02-20 12:02:05

@binnur we fixed the boiler problem

👍 lia_johansen, binnur, michelle_dalton
💥 binnur
lia_johansen (lilixlucky@gmail.com)
2017-02-20 12:09:19

Thanks @sean_hooyer

binnur (binnur.alkazily@gmail.com)
2017-02-20 12:12:00

Thank you @sean_hooyer

declan_freemangleason (declanfreemangleason@gmail.com)
2017-02-20 13:13:47
jack (jack@phroa.net)
2017-02-20 13:22:51

@brianhutchison @jeremylipschutz try replacing this with m_launcher.setLauncher(LauncherState.OFF) https://github.com/Spartronics4915/2017-STEAMworks/blob/master/src/org/usfirst/frc/team4915/steamworks/commands/LauncherCommand.java#L73

binnur (binnur.alkazily@gmail.com)
2017-02-20 14:20:24

@jack to resolve the version path issue w/ git - is that documented anywhere?

timo_lahtinen (timolahtinen1@gmail.com)
2017-02-20 17:21:40

For future problem solving, my git path is C:\Program Files\Git\mingw64\bin

brian_hilst (brian@hilst.org)
2017-02-21 09:28:50

@niklas_pruen, @binnur and I completed and tested a new DriveCurveCommand to use in getting from the boiler to the baseline faster during autonomous. It appears to work well and is low risk. The ParameterizedCommandGroup was also updated to support a new “Curve” command, and the group now supports a variable number of parameters per command (e.g. Shoot takes no params and Curve takes 3; see the sketch after the list below).

Here is what remains for today:

  1. @timolahtinen @liajohansen review and merge the pull request.
  2. @declanfreemangleason @niklaspruen Integrate and tune the new “Curve” command into the "Drive Shoot and Cross Baseline Position 3” group. A new "Cross baseline from boiler” was added to help tune the distance.
  3. Group needs to determine how best to increase the speed as the default is too slow to make it across the baseline in 15secs reliably. We found setting peak voltage in the Drivetrain.setControlMode() has no effect. Drivetrain setMaxOutput needs to be increased to at least 0.4. Our current thinking is to only do this for this command to avoid impacting other driving commands. We will need to test with the other auto commands to see if it helps get our time down.
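
For readers following along, an illustrative sketch of the "variable number of parameters per command" idea; the step names match the ones above, but the parameter meanings and the addX() helpers are guesses, not the team's implementation:
```java
import edu.wpi.first.wpilibj.command.CommandGroup;

public class ParameterizedCommandGroupSketch extends CommandGroup
{
    public ParameterizedCommandGroupSketch(String... params)
    {
        int i = 0;
        while (i < params.length)
        {
            String name = params[i++];
            switch (name)
            {
                case "Shoot":   // consumes no parameters
                    addShoot();
                    break;
                case "Curve":   // consumes three parameters (meanings are a guess here)
                    addCurve(Double.parseDouble(params[i]),
                             Double.parseDouble(params[i + 1]),
                             Double.parseDouble(params[i + 2]));
                    i += 3;
                    break;
                default:
                    System.out.println("Unknown auto step: " + name);
            }
        }
    }

    private void addShoot() { /* addSequential(new LaunchCommand(...)) in the real group */ }
    private void addCurve(double a, double b, double c) { /* addSequential(new DriveCurveCommand(...)) */ }
}
```
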
brian_hilst (brian@hilst.org)
2017-02-21 09:35:00

@timolahtinen @liajohansen Reminder that we would like to reconfigure the test field down by the 300 building main doors to provide a larger space for autonomous testing. The current location does not allow driving across the baseline. We should also remeasure and verify the key field positions for the boiler, key, baseline & gear lift(s) to make sure our autonomous commands work correctly on the real field, and to give us the opportunity for more accurate driving practice.

lia_johansen (lilixlucky@gmail.com)
2017-02-21 09:46:17

@brian_hilst just merged the code. We can set up the field for you down the hall. Launcher group will be getting the robot first to do some testing and then autonomous will get it for the rest of time. Robot will be in Programming from 1-4 pm

brian_hilst (brian@hilst.org)
2017-02-21 09:47:19

@liajohansen Thanks! @niklaspruen and I will go over early to work on moving so we’re ready to test at 1pm

lia_johansen (lilixlucky@gmail.com)
2017-02-21 09:47:55

@brian_hilst i dont know if doors will be open before 1 pm

lia_johansen (lilixlucky@gmail.com)
2017-02-21 09:48:15

and auto people will get the robot probably around 1:30? 1:45 ish

brian_hilst (brian@hilst.org)
2017-02-21 09:48:31

@lia_johansen Ok. It shouldn’t take long to do at 1pm then.

lia_johansen (lilixlucky@gmail.com)
2017-02-21 09:49:55

@brian_hilst agreed. That's awesome you and niklas got a curve!

chrisrin (chrisrin@microsoft.com)
2017-02-21 09:50:08

It was awesome to see things come together at the end of last night, and I think the work of the programmers was a big contribution. One thing I noticed on the approach to the climber and other game tasks that would be a major help imo is the ability to turn slowly with finesse and accuracy. It seemed like when a very small turn was needed, what happened instead was a large, nearly 90 degree turn. On the Xbox controller there is an extra joystick not yet used, as I understand... could it be used for slower, more precise turning? Maybe a way to trigger a tiny pulse of current to nudge the robot with a single press left or right?

lia_johansen (lilixlucky@gmail.com)
2017-02-21 09:54:04

@chrisrin Yes, potentially

lia_johansen (lilixlucky@gmail.com)
2017-02-21 09:58:57

@chrisrin alex said the turning seemed fine; he just needs to practice. Also, when the robot was approaching the boiler it was not on the carpet; that contributes to the way the robot turns

binnur (binnur.alkazily@gmail.com)
2017-02-21 10:00:02

@lia_johansen @riyadth and I would like to work w/ the launcher team to see if we can tune the launcher values further — we are building some theories to test w/. fyi

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:00:53

@binnur sounds great. Thanks! Could you test that in 1.5 hrs or less?

binnur (binnur.alkazily@gmail.com)
2017-02-21 10:01:37

yup - I really hope so… I don’t think anyone could last tuning for that long and not lose their mind 😉

:clio: lia_johansen
chrisrin (chrisrin@microsoft.com)
2017-02-21 10:01:53

@lia_johansen ah ok - the robot just seemed a bit lurchy on the in-place turns when trying to line things up, and I assumed it was just a characteristic of tank drive so a nudge left / nudge right function could save cycle time - of course practice will help a lot

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:04:42

@chrisrin Considering a time constraint practice driving will be the best option. Jack got pretty good at turning and driving with auto testing, alex will get there too

chrisrin (chrisrin@microsoft.com)
2017-02-21 10:06:25

could be good to have for back up drivers that won't get much practice time... time permitting

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:08:35

@binnur ^^^ what do you say about the slower turning?

alex_lf (al3xlf@gmail.com)
2017-02-21 10:09:15

I noticed the turning was different on carpet vs the hallway floor which is what we used for climbing. Not sure if this has been said

✅ lia_johansen
alex_lf (al3xlf@gmail.com)
2017-02-21 10:09:54

Turning was great on carpet I had no issues there, I just need more practice

declan_freemangleason (declanfreemangleason@gmail.com)
2017-02-21 10:09:56

@lia_johansen @binnur @chrisrin Yeah, I wanted to mention that this might be a non-issue because the robot's movement on the carpet vs. off is very different.

✅ lia_johansen
binnur (binnur.alkazily@gmail.com)
2017-02-21 10:10:07

I didn’t see issues w/ turning - however, one needs to get adjusted w/ driving using the throttle button down while turning. the xbox controller, for a mentor, requires more training

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:10:39

@binnur where is the throttle button?

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:10:50

Ik where it is on the regular joystick

binnur (binnur.alkazily@gmail.com)
2017-02-21 10:10:53

as @brian_hilst indicated, we should explore cranking up the maxoutputspeed of the robot and see how teleop behaves

binnur (binnur.alkazily@gmail.com)
2017-02-21 10:11:33

(on the xbox, to turn, I had to keep pushing the move forward button w/ turns. and that controlled the turning action to better accuracy)

chrisrin (chrisrin@microsoft.com)
2017-02-21 10:12:10

thx for considering the idea - I would like to move the climbing rope to the carpet today & see how much easier it is for driver to line up - I want those 50 points every match 🙂

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:12:34

@chrisrin agreed! We can do that.

lia_johansen (lilixlucky@gmail.com)
2017-02-21 10:13:02

@binnur outline for our time with robot (1-4 pm). Launching then autonomous then climber

👍 binnur
binnur (binnur.alkazily@gmail.com)
2017-02-21 10:15:26

and lots of driving 🙂

✅ lia_johansen
👍 chrisrin
coachchee (echee@bisd303.org)
2017-02-21 11:11:14

Repost from Paul.

} Paul Vibrans (https://spartronics.slack.com/team/paul_vibrans)
binnur (binnur.alkazily@gmail.com)
2017-02-21 11:37:22

@lia_johansen ^^^ pls check w/ programmers on how we can best do this

binnur (binnur.alkazily@gmail.com)
2017-02-21 11:37:49

Launcher team, here is what I am thinking for a plan. Please review and adjust as needed. See you soon.

  1. Lets make sure talon is setup in code
```
m_launcherMotor.setAllowableClosedLoopErr(0);  // this may cause oscillation; we need to observe behavior
m_launcherMotor.setCloseLoopRampRate(0.0);     // lets get to our max output quickly
m_launcherMotor.setVoltageRampRate(0.0);       // lets get to our max output quickly
m_launcherMotor.enableBrakeMode(false);        // defensive programming in case unexpected things happen in configurations
```
  2. Verify FF is set correctly using RIO webpage (our calculations show 0.04995 for 3000RPM)
    • zero PID
    • set slider to 3000RPM (assuming this is the set RPM we are targeting)
    • adjust FF till webpage RPM is 3000
    • using this FF, set slider to 1500RPM —> verify webpage RPM is ~1500
  3. Adjust PID values and test w/ balls to ensure system behaves
    • Increase k_p till we see oscillation
    • Adjust k_i and k_d
  4. Question: how does isLauncherAtSpeed() work?
  5. Once tuned, please set these values in code to make sure they are saved regardless of firmware updates

And, if this process works, please copy and include it as comments in your launcher code for next year.

Note to self: we may need to set the i-zone given k_i.
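
For the record, the 0.04995 feed-forward figure above checks out if the launcher uses a 4096 counts/rev mag encoder (that sensor resolution is an assumption); a small sketch of the arithmetic:
```java
public class LauncherFFCheck
{
    public static void main(String[] args)
    {
        final double targetRpm = 3000.0;
        final double countsPerRev = 4096.0;    // assumption: CTRE mag encoder resolution

        // Talon SRX closed-loop velocity is measured in native units (counts) per 100 ms
        double nativePer100ms = targetRpm * countsPerRev / 600.0;   // 3000 * 4096 / 600 = 20480

        // Talon full output is represented as 1023, so kF scales full output to the target speed
        double kF = 1023.0 / nativePer100ms;                        // 1023 / 20480 ≈ 0.04995

        System.out.println("native units per 100ms: " + nativePer100ms + ", kF: " + kF);
    }
}
```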

lia_johansen (lilixlucky@gmail.com)
2017-02-21 11:38:20

@noah_martin please read paul's recommendation

noah_martin (2013islandboy@gmail.com)
2017-02-21 12:05:58

@lia_johansen got it

binnur (binnur.alkazily@gmail.com)
2017-02-21 12:11:28

@noahmartin it maybe worth doing two buttons or same button but w/ hold behavior -- @jack @liajohansen make sure to weigh in the desired behavior

dana_batali (dana.batali@gmail.com)
2017-02-21 12:43:42

@channel: since we have limited time remaining with the robot, I feel the need to raise a bit of a flag. If things have been working “well enough” we need to be careful not to introduce more risk / variance in our attempts to improve things. Code changes should be vetted extra-carefully at this stage.

dana_batali (dana.batali@gmail.com)
2017-02-21 12:50:11

@binnur - iirc the climber has two buttons (one for slow, one for fast). It is currently possible for the driver to initiate climbing in slow mode, then after a grab to switch to high-speed. There was discussion last night on how/whether we can automate this task. There was also discussion on how to auto-stop. At this late stage, we need to carefully evaluate whether the value of additional climbing automation outweighs the risks. This overlaps with the relative priorities for robot access: juggling between more driver practice, more PID tuning for drivetrain and for launcher, and more automation for climber. If it were up to a vote, I think I’d vote that we devote a little more time to auto strategies and then let the drivers practice, practice, practice.

(i’ve always got 2 cents to spare :-))

binnur (binnur.alkazily@gmail.com)
2017-02-21 12:55:49

I will support climber assessment on not needing automation as long as it can be manually managed by drivers safely. I think @riyadth will have input on this. Good to hear we already have buttons programmed

binnur (binnur.alkazily@gmail.com)
2017-02-21 20:38:36

@brian_hutchison please post the failed blue run to this slack channel. Thanks.

binnur (binnur.alkazily@gmail.com)
2017-02-21 20:40:32

^^oppps! Wrong brian!

binnur (binnur.alkazily@gmail.com)
2017-02-21 20:41:36

@brian_hilst please post the failed blue run auto to this slack channel. Thanks!

brian_hilst (brian@hilst.org)
2017-02-21 20:44:14
chrisrin (chrisrin@microsoft.com)
2017-02-21 22:58:59

My 2 cents on climber: manual motor stop seems viable based on testing so far. Just have to make sure the touchpad is full pressed because when motor is cut the rope will slip back to the most recent ratchet position ((maybe 1/8 to 1/4 inch?). I would not prioritize auto-stop on climber over giving drivers more practice time.

coachchee (echee@bisd303.org)
2017-02-22 00:53:24

Did we test the robot with tether ? Don't forget to bring our backup radio ?

binnur (binnur.alkazily@gmail.com)
2017-02-22 13:32:04

@lia_johansen ^^^

dana_batali (dana.batali@gmail.com)
2017-02-22 14:34:03

@binnur, @brianhilst @declanfreemangleason, @jack, @lia_johansen :

I looked into a better refactoring of the auto mode selection process and here's what I came up with. The commit message spells out how we got to this place. I won't submit a pull request for this unless the collective feels it's warranted. It's possible there are alternate approaches, and I'd encourage declan to peruse this code and develop an opinion on the why, and on whether we should make such a change or invest in an alternate approach

https://github.com/dbadb/2017-STEAMworks/commit/b1dfe7982be06f2070ed28e139e7ca3d9cbcc8dc

lia_johansen (lilixlucky@gmail.com)
2017-02-22 15:15:59

@coachchee we did not test with tether. Back up radio is on my list

binnur (binnur.alkazily@gmail.com)
2017-02-22 19:23:01

@danabatali the code and approach looks good. thank you for making time for it. @declanfreemangleason @jack @lia_johansen please review and discuss how you want to approach this. It can easily be tested using our backup robot to validate.

jack (jack@phroa.net)
2017-02-25 23:33:32

dana_batali: I like your changes (especially how much clearer the ParameterizedCommandGroup constructors are when broken up like that), however I feel like there's a slight lack of flexibility in adding new strategies with all the string constants (we're done doing that anyway, so whatever) and, minor-ly, getAllianceScale's method body could be a call to returnForSide(alliance, -1, 1).

jack (jack@phroa.net)
2017-02-25 23:33:34

I'm going to see what I can put together regarding an object-based parameterized commandgroup; we probably won't be using it - I'm just proving a point

jack (jack@phroa.net)
2017-02-25 23:33:52

maybe if (big if) we have extra time in this week's meetings we can do some testing with it on the second chassis

dana_batali (dana.batali@gmail.com)
2017-02-26 10:59:12

I fully concur that an array of actions is better than an array of pairs of strings.. We do have to tread carefully given the nearness of our first competition

dana_batali (dana.batali@gmail.com)
2017-02-26 11:00:34

But the more pressing problem pertains to how/where to filter the recording list to present a UI before the alliance is known

jack (jack@phroa.net)
2017-02-26 11:05:52

honestly, with how well declan's is doing we might consider just deleting each recording. further, the ugly timestamp names can be fixed with just a 'mv <filename> "Strategy Description"'

jack (jack@phroa.net)
2017-02-26 11:06:10

(once inside the robot)

dana_batali (dana.batali@gmail.com)
2017-02-26 11:54:26

Certainly, I think we've learned that the recordings grow "stale" over time and so the ones currently present on the robot are likely to be in the stale state. I do think that it's a very handy backup to have this recording capability. And since it's really easy to assert that the file name is the GUI, you're right, all we need to do is choose those names artfully. Still we'll end up with two versions of the alliance-specific recordings. I think we have three choices to help prevent driver error: 1) do nothing, it's not a big deal 2) offer a list of options, then during command construction (at autoInit-time), we could further "specialize" the file name. 3) perform the filtering on the driver station with a snippet of javascript (and a recording naming convention). If, as you suggest, the recording approach may not end up being used, then option #1 is a fine choice.

jack (jack@phroa.net)
2017-02-26 13:51:00

I feel good about just leaving two (if any) replays for the driver to pick: Red 1 Drive - Shoot - Cross Baseline and Blue 3 Drive - Shoot - Cross Baseline

brian_hilst (brian@hilst.org)
2017-02-27 11:04:27

Is there a plan for the programming team for the Tues & Weds afternoon sessions this week?

Clio Batali (cliombatali@gmail.com)
2017-02-27 14:41:56

*Thread Reply:* @brian_hilst Programmers will be meeting after the scouting part of the meeting concludes (around 4:30) in order to go through Lia's agenda. The 2nd chassis will be available for you guys to test new code on until the primary robot is ready to be driven (theoretically about 6), at which point drivers and remaining programmers will run through a few more tests/practice. There will be time on Wednesday for packing and driving as well

coachchee (echee@bisd303.org)
2017-02-27 11:38:28

yes, captains will respond. I hope.

:clio: Clio Batali
riyadth (riyadth@gmail.com)
2017-02-27 13:12:49

CTRE has updated their libraries again. Now up to version 4.4.1.12. The biggest change in the release notes is the use of current measurement, which I think we might be doing in some of our subsystems. We may want to update our workstations and deploy freshly built code to the robot... http://www.ctr-electronics.com/hro.html#product_tabs_technical_resources

riyadth (riyadth@gmail.com)
2017-02-27 13:13:22

The release notes:

CTRE Toolsuite 4.4.1.12 Installer
    CTRE Toolsuite 4.4.1.12 Installer
    Talon SRX Firmware (2.34): Minor modification to start up frame.  This will allow for future features (such as ESD detection).  This will not impact any current use case of the Talon SRX.
    Talon SRX Firmware (2.33): Fixed issue where motion magic halts abruptly due to velocity-to-acceleration ratio exceeding threshold.
    Talon SRX Firmware (2.33): Changed Talon SRX current measurement to round instead of truncate.
    Talon SRX Firmware (2.31): Signal added to Status 8 for CAN driver status.
    Talon SRX Firmware (2.31): Robustness improvement in CAN buffering.  This was not necessary to resolve any issues, this was merely an improvement.
    Talon SRX Firmware (2.31): Various optimizations in the current-draw measurement.  This was not necessary to resolve any issues, this was merely an improvement.
    Talon SRX Firmware (2.31): Solved a possible divide-by-zero condition in the current-draw measurement.  This did not solve any known or reproducible issues.
    Talon SRX Firmware (2.30/10.30): Timing improvements added to correct the regression issues of the previous installer's Talon firmware (X.23).
    Talon SRX Firmware (2.30/10.30): The velocity measurement window will automatically truncate to the nearest supported value (1, 2, 4, 8, 16, 32, 64).
                                     For example, if the robot controller attempts to set a window value of '50', the signal value will be truncated to '32'.
    Function Limitation: As a result of the performance improvements in the current-draw measurement, the current measurement for a given load may be dissimilar to the measurement when using previous firmware.  The difference should not exceed 0.125A, and only occurs near current-draws that are close to a multiple of 0.125A boundary.
    Class Library (FRC Java 2017_v5): Updated comment headers.
    Class Library (FRC C++ 2017_v5): Fixed bug where JNI library was not saving the last error code.
    Class Library (FRC C++ 2017_v5): Updated comment headers.
    Class Library (FRC LabVIEW 2017_v6): Updated Talon Context Help & VI Palette short names.
lia_johansen (lilixlucky@gmail.com)
2017-02-27 13:40:33

@brian_hilst we will be testing auto (on second robot or first), testing the robot with the tether, labeling buttons, creating to-bring lists, and planning for competition.

✅ lia_johansen, timo_lahtinen
declan_freemangleason (declanfreemangleason@gmail.com)
2017-02-28 18:32:29
declan_freemangleason (declanfreemangleason@gmail.com)
2017-02-28 18:33:38

I'll make a pull request to resolve this soon.

lia_johansen (lilixlucky@gmail.com)
2017-02-28 18:58:54

I merged his pull request.

lia_johansen (lilixlucky@gmail.com)
2017-02-28 19:51:26

@dana_batali : we used your code (seen above in declan's message) and today there were issues where the camera wouldn't switch with the button and sometimes not with the drop down menu. What are your thoughts? And are you going to be at the meeting tomorrow?

riyadth (riyadth@gmail.com)
2017-02-28 20:58:08

*Thread Reply:* lia_johansen: Do both cameras work when loaded in separate web pages? I recall that Dana made shortcuts in Chrome (bookmark bar) on the drivers station.

lia_johansen (lilixlucky@gmail.com)
2017-02-28 21:00:15

*Thread Reply:* We did not try that

lia_johansen (lilixlucky@gmail.com)
2017-02-28 20:12:04

@channel : for tomorrow, not every programmer needs to come. Timo, declan, and I will be there; others do not have to come. We will be working on fixing the camera on the smartdashboard.

brian_hilst (brian@hilst.org)
2017-02-28 20:49:32

@lia_johansen: We might also consider falling back to the prior version if we can't readily find the problem. That might be a good place to start to make sure we have a viable option since the cameras are so critical this year.

lia_johansen (lilixlucky@gmail.com)
2017-02-28 20:51:29

@brian_hilst i agree

dana_batali (dana.batali@gmail.com)
2017-02-28 20:55:28

the dashboard issues are very simple. i guess I'm surprised that the work i offered for the robot code, to fix autonomous selection (and also reduce the likelihood of "RobotDrive not updated enough"), doesn't appear to have been pulled?

dana_batali (dana.batali@gmail.com)
2017-02-28 20:56:02

that is, my dashboard changes weren't that big of a deal (notwithstanding declan's discovery of a typo).

dana_batali (dana.batali@gmail.com)
2017-02-28 20:56:47

my big concern is for autonomous selection... this is what jack and I were discussing earlier on this thread. Lia, Timo - did you deem these unuseful?

dana_batali (dana.batali@gmail.com)
2017-02-28 20:57:43

(btw: sorry for the smartdashboard typos, i didn't think anyone would pull these beside me)

dana_batali (dana.batali@gmail.com)
2017-02-28 20:58:21

I will attend tomorrow's meeting if you feel that I'll be helpful.

lia_johansen (lilixlucky@gmail.com)
2017-02-28 21:53:32

@dana_batali i believe declan just tested your code from your branch. The selection seems to be working fine. I think it would be helpful that you attend for a bit tomorrow if possible

dana_batali (dana.batali@gmail.com)
2017-03-01 07:57:00

@lia_johansen - i'll try to be there around 3:30pm

lia_johansen (lilixlucky@gmail.com)
2017-03-01 08:02:47

@dana_batali thanks!

lia_johansen (lilixlucky@gmail.com)
2017-03-02 12:56:45

@channel : hey everyone. We seem to have lost a school laptop charger. Have any of you accidentally brought it home or placed it somewhere?

lia_johansen (lilixlucky@gmail.com)
2017-03-02 15:32:07

@channel : for competition , only @declanfreemangleason @brianhutchison @timo_lahtinen @jack need to bring laptops. Others do not

binnur (binnur.alkazily@gmail.com)
2017-03-02 16:27:34

note - @declan_freemangleason needs power for his laptop -- please make sure we have an extension cord as it may be hard to find plugs

✅ lia_johansen
declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-04 09:40:45

The length of the key is 155 inches

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-04 09:41:25

The length from the baseline to the hopper button base is 10 inches

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-04 09:42:49

The length from the diamond plate to the gear spring not including the part that sticks out ~3in is 110in

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-05 18:26:37

  • Get agitator working again
  • Make gear placement easier for drivers
  • Make an autonomous program that aligns with the gear
  • Make autonomous atoms faster

✅ lia_johansen, brian_hutchison
lia_johansen (lilixlucky@gmail.com)
2017-03-05 18:37:45

Agitator is probably mechanics but we will test

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-05 18:38:28

I put it as first priority just in case

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-05 19:39:35

Also a CAN self check

👍 Clio Batali
declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-05 19:39:49

Is probably a good idea

binnur (binnur.alkazily@gmail.com)
2017-03-06 21:23:15

@declan_freemangleason good list, thank you!

binnur (binnur.alkazily@gmail.com)
2017-03-06 21:32:51

@lia_johansen please add following to programmer’s todo list

  • ‘testing for launcher’ w/ autonomous and w/ teleop
  • discuss and gather requirements from drivers (alex+will) on how best to provide driving controls for gear placement — also review how/if they are using the camera for gear placement —> I noted several teams w/ light ring around their camera @dana_batali thoughts?
  • discuss w/ drivers on the speed controls for intake, possibly a button to shift down the robot speed —> noted that as our robot moves fast w/ intake, it also has a tendency to push balls away. can we improve the efficiency of intake by driving slower?
dana_batali (dana.batali@gmail.com)
2017-03-07 08:27:52

*Thread Reply:* binnur: i understood from clio that the back camera was flaky and didn’t offer much value during the entire match. I wasn’t aware of this and we could probably have fixed it, but it all turned out well anyway 🙂.

Regarding light rings, i believe a couple teams (squirrels, …) had a vision assist for gear delivery. This is easier to implement with a mecanum wheel-base, so I would think that it might be beyond our means. Toward the end of the match, Alex was getting much better at gear-delivery, so my guess is that getting the camera working reliably is the priority there. This also overlaps with the gear-pickup problem since visibility on the opposing side was very low. Physical enhancement to the gear holding system (perhaps broadening the holder into a Y shape) would also be nice.

👍 binnur
binnur (binnur.alkazily@gmail.com)
2017-03-06 21:39:08

@liajohansen / @declanfreemangleason — autonomous program for the gear will require some work, including possibly using the backup robot for testing (given time available w/ the actual bot). please make sure to review w/ leadership team for the priorities and plans on how to schedule this work

Clio Batali (cliombatali@gmail.com)
2017-03-06 21:40:25

A couple of quick comments on that list: the back camera wasn't used for the majority of competition because it was flakey/not working (I discussed with Declan, and he seems to have a fix in mind). As for intake, once it was physically fixed after having the bottom bar snap in half, our intake was the smoothest it's ever been - it looked like we were just eating up balls from the ground! Though we may want to fine-tune speeds, this indicates to me that the majority of tweaks with the intake are mechanical

👍 binnur, lia_johansen
Clio Batali (cliombatali@gmail.com)
2017-03-06 21:41:11

Also, the agitator problems are 100% mechanical/motor based, not software

lia_johansen (lilixlucky@gmail.com)
2017-03-06 21:42:32

@binnur i will bring up these points in the upcoming leadership meeting. Thank you.

👍 binnur, Clio Batali
binnur (binnur.alkazily@gmail.com)
2017-03-06 21:45:58

@lia_johansen @riyadth indicated that the FTA recommended using Firefox for roborio dashboard — lets remove the visible links to IE from our driver station and replace w/ Firefox for default browser. Riyadth indicated the Firefox interface was definitely more responsive

lia_johansen (lilixlucky@gmail.com)
2017-03-06 21:46:44

Yeah, it definitely is, will do @binnur

💥 binnur
binnur (binnur.alkazily@gmail.com)
2017-03-06 21:47:51

happy to de-prioritize/remove IE from daily use 🙂

chrisrin (chrisrin@microsoft.com)
2017-03-06 21:52:15

There is also the Edge browser 🙂 - seems like Firefox is best for this application

dana_batali (dana.batali@gmail.com)
2017-03-06 22:21:51

hold it, we are supposed to be exclusively using Firefox. If anyone is using anything else, it has definitely not been approved by the "camera team".

dana_batali (dana.batali@gmail.com)
2017-03-06 22:22:48

@riyadth, @binnur @Clio Batali @declan_freemangleason : were there any signs that we were using anything other than firefox during the competition?

binnur (binnur.alkazily@gmail.com)
2017-03-06 22:23:37

@dana_batali this is specifically for the roborio — as the IE was still in the toolbar, that is what I started when we were troubleshooting the CAN bus — this is why I am suggesting removing IE from any visible toolbars and replacing w/ firefox as default

dana_batali (dana.batali@gmail.com)
2017-03-06 22:24:38

@binnur: ah, hadn't gotten that, thanks for the clarification. I guess we would need to install silverlight plugin for firefox

binnur (binnur.alkazily@gmail.com)
2017-03-06 22:26:39

they used firefox w/ roborio at the competition — so, sounds to me like it works off the shelf

paul_vibrans (pvibrans@tscnet.com)
2017-03-07 05:32:35

@paul_vibrans has joined the channel

paul_vibrans (pvibrans@tscnet.com)
2017-03-07 05:36:51

I just watched 1574 in the Haifa match. Their autonomous goes straight to the first hopper to get more balls then back to the boiler to shoot 30+ balls before auto ends. Their shooter rate is faster than ours.

paul_vibrans (pvibrans@tscnet.com)
2017-03-07 06:21:00

I got that wrong, they shoot from the hopper location so must have a two position shooter.

binnur (binnur.alkazily@gmail.com)
2017-03-07 07:27:31

If I recall, we are at about 2 balls a sec in our shooter and our current autonomous takes about 13.5 sec. To speed anything up will require time w/ the robot to tune it correctly (including setting up a realistic field). Given that, I recommend optimizing around points and getting closer to the gear delivery during autonomous as a better strategy than trying to optimize for more shooting points. @lia_johansen can follow up w/ leadership team on priorities and focus.

lia_johansen (lilixlucky@gmail.com)
2017-03-07 07:37:48

I was talking with Brian H, and he said he can make the shooter speed up more quickly and save us around 15 seconds for the match

dana_batali (dana.batali@gmail.com)
2017-03-07 08:15:45

*Thread Reply:* lia_johansen: we discussed this in the car ride home and we agreed that it’s not a “shoo-in”. That is: brian would need significant time with the robot and can’t guarantee that increasing the speed won’t also decrease accuracy. If the priority is high enough and expectations are tempered, it would be something worth exploring. I currently suspect that getting back to where we were in Auburn is the highest priority, and this might be followed by addressing the giant numbers of dropped gears that we experienced. These activities represent scheduling difficulties with respect to launcher speedups, in that robot access is a precious resource in the next 2 weeks.

lia_johansen (lilixlucky@gmail.com)
2017-03-07 08:18:55

*Thread Reply:* That makes sense. I will still ask what their highest priority is and what we as programmers can do to help

dana_batali (dana.batali@gmail.com)
2017-03-07 08:19:05

*Thread Reply:* And I agree with @binnur that an auto gear-drop would be worth spending programming time on.

dana_batali (dana.batali@gmail.com)
2017-03-07 08:19:59

*Thread Reply:* (indeed we definitely want programming to assist/support the LT to deliver on their priorities!)

lia_johansen (lilixlucky@gmail.com)
2017-03-07 08:20:43

*Thread Reply:* I will give a update after our meeting wednesday. I put binnur's points into my "presentation" for tomorrow.

lia_johansen (lilixlucky@gmail.com)
2017-03-07 07:38:00

I'll bring that up at the meeting

paul_vibrans (pvibrans@tscnet.com)
2017-03-07 07:39:53

I took a closer look at 1574's match record and found they are a good gear bot as well. They can only load balls from a hopper, so after two attempts at shooting they switch to gears. It took them halfway through quals to get dialed in, but then in the finals they could get a whole hopper load in during auto. For our strategy, a gear option and a fuel option would be good. The human players on the airship are limiting in a gear-only strategy.

binnur (binnur.alkazily@gmail.com)
2017-03-07 10:39:26

Agreed that the human players are the critical path for gear delivery during auto. My goal would be to reduce the time it takes for delivery right after auto (basically do the next action faster) which would gain us about 2-3 secs in heading to the hopper to start our ball cycle

jack (jack@phroa.net)
2017-03-07 10:39:50

go for a side peg, then back out into a hopper?

binnur (binnur.alkazily@gmail.com)
2017-03-07 10:41:43

If u trust we can do a successful delivery, yes. But I am thinking just sit at the gear delivery station and complete that action right after we start teleop vs what we do now (drive to gear, position and drop and then move to hopper).

chrisrin (chrisrin@microsoft.com)
2017-03-07 11:27:52

One thing to consider - more strategy I suppose but relevant: the difference between the number of gears for 3 rotors vs. 4 rotors is so large that 3 rotors is (or will become) pretty easy for 2 gearbots to achieve while 4 rotors is very rare even with 3 gearbots (given traffic). This means it is possible that more times than not that our alliance will not even need our pre-loaded gear to get the three rotors, and it may make sense to not drop off that gear at the beginning & instead focus on fuel immediately (increasing opp for 3 rotors + 3 climbs + the ranking point). And then if it seems like that pre-loaded gear is critical to achieving 3 rotors, THEN drop it off (& possibly sacrifice opp for the 40 kpa / ranking point).

chrisrin (chrisrin@microsoft.com)
2017-03-07 11:28:43

later in match I mean

riyadth (riyadth@gmail.com)
2017-03-07 16:00:14

I like that strategy. I do agree that the 6-gear requirement for the last rotor is very difficult for most teams, and I believe that we had great success this past weekend because our shooter did give us a point advantage over the other alliance when both sides only had three rotors running. I believe that it is also a valid strategy to get three rotors running (only), and don't collect any more gears (all bots on the alliance). Instead, use the fuel points as the tie-breaker, and spend any remaining time in the match in a defensive mode, to prevent the other alliance from getting 4 rotors running.

👍 Clio Batali, finn_mander
dana_batali (dana.batali@gmail.com)
2017-03-07 16:56:23

*Thread Reply:* riyadth: i just shared three threads into strategy… To do this you use the “share” button which looks like a upward sweeping arrow

riyadth (riyadth@gmail.com)
2017-03-07 17:00:56

*Thread Reply:* Thanks for the tip! And for sharing the messages.

riyadth (riyadth@gmail.com)
2017-03-07 16:00:36

I think these comments should be re-posted to the strategy channel. How do we do that?

paul_vibrans (pvibrans@tscnet.com)
2017-03-07 16:07:28

After watching a bunch of matches in other venues I would say there is a common thread of intense defense after the 3rd rotor is turning if the opponents only have two going. There are a number of standard defensive patterns that seem to be evolving.

dana_batali (dana.batali@gmail.com)
2017-03-08 08:49:04

nvidia announced their newest jetson, the tx2. since they are a sponsor, i thought it was worth sharing and having a few students read about it, including marketing, @joncoonan

https://developer.nvidia.com/embedded/buy/jetson-tx2-devkit

Clearly this will have no impact on this year's robot.

chrisrin (chrisrin@microsoft.com)
2017-03-09 12:09:15

I saw a CD thread a while back that revealed there are numerous teams that include a separate monitor for video from the robot on their drive stations, and it made me wonder how usable our video feed is on the laptop monitor for Alex driving. Sight lines are often pretty bad I would guess given the size of the airships, meaning the robot video feed must be used. Would a small separate monitor on the drive stations for video feed from robot help? It may not be terribly difficult to add (says I, who knows very little about how the drive station is set up).

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 14:08:19

It shouldn't be difficult to add, someone should ask Alex what he thinks.

alex_lf (al3xlf@gmail.com)
2017-03-09 15:31:00

Honestly the only thing I need to look at on the screen is the camera view, and that's pretty much what's on there during the match, plus there isn't really room for another monitor. Cool idea though

dana_batali (dana.batali@gmail.com)
2017-03-09 15:33:23

i suspect we can make the image a little larger (10%-ish) if that seems worthwhile. Of course higher priority is just to get the back-view camera working reliably, which I understood that it didn't in Auburn.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 15:37:01

@alex_lf When the login prompt for the camera showed up during the competition did you just press cancel?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 15:39:35

@dana_batali Just fyi: The login prompt was showing up because my changes to fix that never got pulled. Should we make the batch file do that when it gets run and just guarantee that upstream is always working? Or maybe just an out-of-date message on the dashboard?

dana_batali (dana.batali@gmail.com)
2017-03-09 15:42:10

@declan_freemangleason : i hadn't understood that it was a simple as the login prompt. If that was the failure condition, then we hope that your firefox-only-fix will resolve the problem. We won't know without some real validation, but it does seem quite likely... But if it wasn't the failure condition, then we have more diagnostic activities ahead. I think it was @Clio Batali who was tasked with setting up the camera feeds, so let's make sure to get her feedback on this.

alex_lf (al3xlf@gmail.com)
2017-03-09 15:43:32

Honestly clio was the one dealing with the cameras since I was helping will set up the robot

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 15:58:39

@dana_batali It may not be that simple, but it's certainly what I think we should try first.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 15:59:20

@Clio Batali When the login prompt for the camera showed up during the competition did you just press cancel?

Clio Batali (cliombatali@gmail.com)
2017-03-09 19:50:26

*Thread Reply:* I was required to log in periodically for the first day as expected, with a dialogue box and all on the 10.49.15.13 page. Later, about halfway through Saturday, when attempting to connect to that page nothing relating to that camera would load (though it looked like it was trying). Once some sort of prompt did show up on the web dashboard that seemed to authenticate the connection to the d-link from there, and I believe it worked for that match, but that was shortly before the camera stopped working.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 21:19:44

*Thread Reply:* Alright, I suppose we'll just need to test and stop speculating then.

dana_batali (dana.batali@gmail.com)
2017-03-09 15:59:31

Without more info from clio, that seems like the clear choice.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 15:59:52

I agree

dana_batali (dana.batali@gmail.com)
2017-03-09 16:00:18

@declan_freemangleason: do you know that the login prompt was presented?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 16:04:17

The login prompt shouldn't be hidden, login credentials are either cached, or the browser shows the prompt.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-09 16:05:24

However, this process seems to be poorly documented and not very transparent so I'm just going off of observations here.

dana_batali (dana.batali@gmail.com)
2017-03-09 16:07:12

We do know that Clio was aware of this requirement, though perhaps not as experienced with the firefox variant... That's why I'm inclined to suspect a different problem. I'll stop speculating at this point... 😉

paul_vibrans (pvibrans@tscnet.com)
2017-03-14 07:25:44

I watched an Australian bot at the Southern Cross tournament get 40+ fuel points in six out of seven qualifying matches with a front-of-the-boiler shooter like ours. Their auto routine started with the bot angled parallel with the key line and touching the wall with one corner. When it starts, it follows the key line to the hopper release and squares up to the wall as balls pile in. Then it backs up, turns parallel to the wall and moves to the boiler, where it turns 45 degrees and drives to the boiler front. Its shooter has more dispersion than ours, but it was getting over ten auto points every time. Can our robot be programmed to do this for the regionals?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-14 08:22:18

@paul_vibrans I had this very thing in mind and I took measurements on the field at Auburn for it, but our autonomous probably needs to be faster. Do you have the team number or a video?

paul_vibrans (pvibrans@tscnet.com)
2017-03-14 08:41:06

The team is 4613. They should be on streaming video from the Southern Cross Regional starting 3:00 PM our time.

paul_vibrans (pvibrans@tscnet.com)
2017-03-14 09:00:22

I looked at the match results for 4613 at the Shenzhen Regional, which they won, and the Southern Cross Regional, where they are currently number one, and see that their teleop scores are generally lower than ours and their autonomous scores are two to three times ours because of more balls to shoot. The difference in teleop may be a function of a more gear centric strategy after auto. I still think our shooter is more accurate because of less dispersion of the shots.

lia_johansen (lilixlucky@gmail.com)
2017-03-14 17:39:18

@channel Here is the list of positions for open house tomorrow. If your name is not here, you may help with greetings and directions. Show second Chassis: Lia, Timo Launcher: Brian, Ronan, Jeremy Autonomous: Declan

coachchee (echee@bisd303.org)
2017-03-14 20:29:19
The Blue Alliance
coachchee (echee@bisd303.org)
2017-03-14 20:29:41
} Enrique Chee (https://spartronics.slack.com/team/coachchee)
whobbs1496 (whobbs1496@gmail.com)
2017-03-14 20:38:21

Here is a youtube link for the autonomous shown above https://www.youtube.com/watch?v=k2pWR593tqI

} FRC Team 4613 - Barker Redbacks (https://www.youtube.com/channel/UCrPaJ2cDBRG5anDkICpr5rg)
dana_batali (dana.batali@gmail.com)
2017-03-15 14:08:21

the new beaglebone seems nearly equivalent to the roborio (more powerful in some ways - built-in IMU)... https://www.arrow.com/en/products/bbblue/beagleboardorg

jack (jack@phroa.net)
2017-03-16 19:36:21

binnur: (#strategy) every button on both the xbox and the joystick is in use; we'd need either the third joystick or to replace the (currently duplicated but that might be for a reason) intake buttons on the joystick

jack (jack@phroa.net)
2017-03-16 19:36:45

or maybe the keyboard?

binnur (binnur.alkazily@gmail.com)
2017-03-16 19:37:07

keyboard…! like that idea, easy to map?

jack (jack@phroa.net)
2017-03-16 19:37:33

it seems quite hard to map

binnur (binnur.alkazily@gmail.com)
2017-03-16 19:38:39

K - how about button on the dashboard to start the command?

jack (jack@phroa.net)
2017-03-16 19:41:43

we could try that, I'd have to look and see how command buttons are implemented in the normal dashboard

jack (jack@phroa.net)
2017-03-16 19:42:02

I don't think there's a similar feature natively in pynetworktables

jack (jack@phroa.net)
2017-03-16 19:43:01

(I think the easiest would be to temporarily plug in the spare xbox controller while we're recording and just remove it when we're done)

👍 binnur
binnur (binnur.alkazily@gmail.com)
2017-03-16 20:07:16

I like keeping it easy and simple

binnur (binnur.alkazily@gmail.com)
2017-03-16 20:08:12

adding a smartdashboard button that calls the command should be simple — but maybe confusing with the two dashboards…

jack (jack@phroa.net)
2017-03-16 20:21:16

I'll look in to it

dana_batali (dana.batali@gmail.com)
2017-03-17 10:01:00

i expect that pynetworktables is sufficient to the task of adding a button. Another approach would be to add a new mode selector (perhaps on the dev page), that triggers button reassignments.

dana_batali (dana.batali@gmail.com)
2017-03-17 10:17:05

A quick perusal suggests that a sendable command creates a named subtable that follows a specific convention. Here's the source for putData:

```
public static void putData(String key, Sendable data) {
    ITable dataTable = table.getSubTable(key);
    dataTable.putString("~TYPE~", data.getSmartDashboardType());
    data.initTable(dataTable);
    tablesToData.put(data, key);
}
```

where the key is the name we select for the command, and here is the implementation of Command::initTable:

```
public void initTable(ITable table) {
    if (m_table != null) {
        m_table.removeTableListener(m_listener);
    }
    m_table = table;
    if (table != null) {
        table.putString("name", getName());
        table.putBoolean("running", isRunning());
        table.putBoolean("isParented", m_parent != null);
        table.addTableListener("running", m_listener, false);
    }
}
```

Now, from a pynetworktables point of view, the only issue is that subtables aren't explicitly supported; we just need a path-name to the individual subtable elements (like /SmartDashboard/OurCommand/running), etc.

dana_batali (dana.batali@gmail.com)
2017-03-17 11:06:26

The other related topic is Buttons/Triggers (which we instantiate in OI.java). We could theoretically simulate a button-press in our webapp... Buttons are also represented in network tables and are "polled" by the scheduler. The important table sub-field is called "pressed". There is a class called "NetworkButton" that spells it out.
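
A hedged sketch of the NetworkButton route described above, where the dashboard writes a boolean and the scheduler polls it like any other button; the key name and the stub command are illustrative, not the team's actual bindings:
```java
import edu.wpi.first.wpilibj.buttons.NetworkButton;
import edu.wpi.first.wpilibj.command.Command;

public class DashboardRecordButton
{
    // stand-in command so this sketch is self-contained
    public static class RecordStub extends Command
    {
        @Override protected void initialize() { System.out.println("recording started"); }
        @Override protected boolean isFinished() { return true; }
    }

    public static void bind()
    {
        // expects the webapp to set the boolean /SmartDashboard/recordPressed to true
        NetworkButton recordButton = new NetworkButton("SmartDashboard", "recordPressed");
        recordButton.whenPressed(new RecordStub());
    }
}
```
On the dashboard side, pynetworktables would just set that boolean key to true to "press" the button.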

paul_vibrans (pvibrans@tscnet.com)
2017-03-18 17:11:46

I just watched match videos of Skunkworks and saw that they must back away from the wall at the loading chute to make the gears fall properly. The back away seems uniform from try to try as if it is a programmed move like autonomous. Is something like this possible or helpful for us?

lia_johansen (lilixlucky@gmail.com)
2017-03-18 19:30:43

Hey programmers, autonomous is going to work on the driving part of the improved auto. @declan_freemangleason will be fixing camera and then move to auto. I do not think launcher (or others) people need to come tomorrow (as mechanics will have robot for 3+ hrs)

riyadth (riyadth@gmail.com)
2017-03-18 19:34:05

@lia_johansen Will we be able to test launcher sensors without launcher team members present? I assume there will be some adjustments done to the encoder mounts, and it will be important to check that everything is working correctly.

binnur (binnur.alkazily@gmail.com)
2017-03-18 21:06:57

@lia_johansen if the intention is to shoot more balls than just 10, we need to speed up the launcher as well — I think we are about 2 balls/sec — launcher team, please verify

brian_hutchison (savingpvtbrian7@gmail.com)
2017-03-18 21:09:02

We're at around 3/sec

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-18 21:16:25

@binnur I think that faster shooting is important, but we won't have any more balls to shoot unless we get faster driving. Do you agree? Do you think a launcher person should be there tomorrow?

riyadth (riyadth@gmail.com)
2017-03-18 21:20:39

I guess we should measure the rate of the shooter to know for sure. And driving faster is the first step to shooting more. However, if we don't get the shooting rate faster, then we probably don't need to drive faster either. I think if the goal is to increase the number of shots fired in autonomous, then we need both a shooter person and a drivetrain person.

lia_johansen (lilixlucky@gmail.com)
2017-03-18 21:32:06

For auto, the launcher will only shoot 10 balls @brian_hutchison ? So that needs to be fixed

brian_hutchison (savingpvtbrian7@gmail.com)
2017-03-18 21:34:23

I'll be there tomorrow at 1

brian_hutchison (savingpvtbrian7@gmail.com)
2017-03-18 21:34:48

I'll fix the ten ball limit before then

jeremy_lipschutz (jdlgobears@gmail.com)
2017-03-18 21:53:33

i can come at 1 as well, i have tennis at bhs at 2 so that'll be fine

lia_johansen (lilixlucky@gmail.com)
2017-03-18 22:19:46

@brianhutchison @jeremylipschutz awesome. Thanks! Just stay as long as u are needed/want to

ronan_bennett (benneron000@frogrock.org)
2017-03-18 22:20:21

I can come at one too if needed

riyadth (riyadth@gmail.com)
2017-03-18 22:56:19

FYI, I used a stopwatch while watching us shoot on the Auburn Mountainview videos, and I'm pretty sure we're at around 2 balls per second (or slower) currently.

riyadth (riyadth@gmail.com)
2017-03-18 22:57:34

Maybe someone can check my math. Here is a good one to measure: https://youtu.be/-RhM2_tzKhg?t=7s

} FIRST Washington (https://www.youtube.com/user/FIRSTWAVideo)
coachchee (echee@bisd303.org)
2017-03-18 23:22:12

Thanks Brian, jeremy_lipschutz, and Ronan for coming in on Sunday at 1 pm.

paul_vibrans (pvibrans@tscnet.com)
2017-03-19 06:39:39

I timed the video at 2.4 balls per second for balls two through nine. There is a startup delay and an ending delay because the agitator has an empty pocket.
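
For anyone double-checking the stopwatch math, the estimate is just balls counted over elapsed time; a worked example consistent with the 2.4 balls/sec figure above (the ~3.3 s is implied by that rate, not a separate measurement):

\[ \text{rate} \approx \frac{N_{\text{balls}}}{t_{\text{elapsed}}}, \qquad \text{e.g. } \frac{8\ \text{balls}}{\approx 3.3\,\text{s}} \approx 2.4\ \text{balls/s} \]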

binnur (binnur.alkazily@gmail.com)
2017-03-19 11:42:39

@liajohansen @declanfreemangleason the main question is from the aspect of ‘should we stay at the boiler and shoot all balls during 15sec autonomous, or end it at some point to get 5pts to cross the barrier’

jeremy_lipschutz (jdlgobears@gmail.com)
2017-03-19 11:45:43

@binnur do we not cross the baseline when we open the hoppers?

lia_johansen (lilixlucky@gmail.com)
2017-03-19 11:47:29

I think so @jeremy_lipschutz

lia_johansen (lilixlucky@gmail.com)
2017-03-19 11:57:51

@binnur i am not sure. I thought we would cross when we open the hoppers? Im not sure though

binnur (binnur.alkazily@gmail.com)
2017-03-19 12:01:00

@liajohansen and @jeremylipschutz this strategy is specifically relating to going to the hoppers first, filling up our storage, shoot, and then cross (or not)

jeremy_lipschutz (jdlgobears@gmail.com)
2017-03-19 12:01:23

@binnur

jeremy_lipschutz (jdlgobears@gmail.com)
2017-03-19 12:01:41

@binnur I don't think we need to cross after we shoot because we cross when we go to the hoppers

riyadth (riyadth@gmail.com)
2017-03-19 12:01:52

Is "crossing the baseline" meaning "on the other side of the baseline at the end of autonomous", or is it "robot crossed the baseline at some time during autonomous"?

lia_johansen (lilixlucky@gmail.com)
2017-03-19 12:02:12

@jeremy_lipschutz i thought so

riyadth (riyadth@gmail.com)
2017-03-19 12:02:14

We should get a rules clarification on that.

Clio Batali (cliombatali@gmail.com)
2017-03-19 12:03:14

Just at some point - the robot only has to break the vertical plane of the line during autonomous (give me a second to find the exact rule)

lia_johansen (lilixlucky@gmail.com)
2017-03-19 12:03:52

I wont be at the meeting today btw

jeremy_lipschutz (jdlgobears@gmail.com)
2017-03-19 12:04:06

For each ROBOT that breaks the BASE LINE vertical plane with their BUMPER by T=0

jeremy_lipschutz (jdlgobears@gmail.com)
2017-03-19 12:04:27

that's what i copied from the matchplay.pdf

binnur (binnur.alkazily@gmail.com)
2017-03-19 12:04:41

sweet!!

binnur (binnur.alkazily@gmail.com)
2017-03-19 12:05:13

then we can just sit at the boiler (unless we want to move towards the gear delivery before auto ends)

Clio Batali (cliombatali@gmail.com)
2017-03-19 12:05:27

Section 4-3 - thanks Jeremy

riyadth (riyadth@gmail.com)
2017-03-19 12:12:37

If we grab a lot of balls (fill our "cargo hold" - what do we call it?) and come back and start shooting, the field management system (FMS) will probably end the launch command at the end of autonomous. Since we may have more balls to shoot, it could be advantageous to keep the shooter running somehow, while the drivers get to the controls. Otherwise the shooter shuts down and has to be restarted, losing a second or so of shooting. This is "optimization" territory, and this suggestion may not be worth implementing. And may not even be legal... Might look cool to have a robot keep going at the end of autonomous. :-)

riyadth (riyadth@gmail.com)
2017-03-19 12:21:35

To pick a time for the "fuel dump" from the hopper, this video is clear and would provide a good approximation: https://youtu.be/-RhM2_tzKhg?t=54s

YouTube
} FIRST Washington (https://www.youtube.com/user/FIRSTWAVideo)
riyadth (riyadth@gmail.com)
2017-03-19 12:22:39

Looks to me like we'd want to wait 4 seconds after pushing the release bar in order to get all the balls.

riyadth (riyadth@gmail.com)
2017-03-19 12:24:48

Of course we can leave earlier, as long as we have enough balls to meet our goal (whatever we set that to be).

brian_hilst (brian@hilst.org)
2017-03-19 12:40:20

Keep in mind that there is a delay in counting balls in the boiler, so we don't know how many we can get counted before it's over.

binnur (binnur.alkazily@gmail.com)
2017-03-19 13:02:07

@here slow start to the day... Will be in before 2pm (but after 1pm) FYI

riyadth (riyadth@gmail.com)
2017-03-19 17:39:45

FYI, firmware versions between the two robots differ: Main robot (bagged) versions: Firmware: 2.1.0f3 Image: FRCroboRIO2017_v8

Backup robot (bare chassis) versions: Firmware: 2.0.0b86 Image: FRCroboRIO2017_v8

riyadth (riyadth@gmail.com)
2017-03-19 17:40:19

Recommendation: update the backup robot to the latest version. If the latest version is newer than what is on the bagged robot, we should probably upgrade that one too.

riyadth (riyadth@gmail.com)
2017-03-19 17:41:22

(Odd behavior with the network camera on the main robot was fixed by unplugging the RoboRIO from the radio and power cycling the radio. The camera remained "fixed" after re-plugging in the RoboRIO.)

riyadth (riyadth@gmail.com)
2017-03-19 17:42:06

No idea if this has anything to do with the firmware, but the problem seems to be related to the main RoboRIO...

coachchee (echee@bisd303.org)
2017-03-19 18:21:07

Thanks for update on the camera problem.

riyadth (riyadth@gmail.com)
2017-03-19 18:33:05

On Chief Delphi, teams report that the current firmware version (as of February) should be 3.0.0f0

riyadth (riyadth@gmail.com)
2017-03-19 18:33:15

Both our RoboRIOs are then out of date.

riyadth (riyadth@gmail.com)
2017-03-19 18:33:35

We should update the practice bot and see if there are any major hiccups, and if not, we should update the main bot too.

paul_vibrans (pvibrans@tscnet.com)
2017-03-19 19:42:43

The video QM19 referenced by Riyadth is interesting because it shows we could have got a lot more points by shifting the shots more to the right. I suspect the drivers could not tell the difference between balls going behind the smoke stack and balls going in. If this is really the problem, is there a way to correct it?

dana_batali (dana.batali@gmail.com)
2017-03-20 10:51:17

@timo_lahtinen et al: I found that I could add an existing project to my eclipse workspace following these steps:

  1. right-click in the package explorer, select import
  2. select an import wizard: General/Existing Projects into Workspace
  3. select root directory: (I selected ../workspace/2017-STEAMworks.clean)
  4. Finish
Ethan Rininger (rinineth000@frogrock.org)
2017-03-22 22:11:47

@Ethan Rininger has joined the channel

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-25 08:20:39

@lia_johansen 8:30 is calibration and measurement

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-26 16:37:02

My priority list in order of importance.

binnur (binnur.alkazily@gmail.com)
2017-03-26 18:12:49

Launcher team, once we figure out the launching issues (theory is that the gunk build up on the launcher wheel caused the problems), let's crank back up the agitator speed - goal is to see if we can do 10ball + gear delivery in auto -- speed + accuracy will matter for success

binnur (binnur.alkazily@gmail.com)
2017-03-26 18:14:32

@lia_johansen please make sure to tag any changes from GP competition

lia_johansen (lilixlucky@gmail.com)
2017-03-26 18:15:18

Okay, so I'll change the RPM in the code from 1880 to 1800

lia_johansen (lilixlucky@gmail.com)
2017-03-26 18:15:22

And then make a tag

lia_johansen (lilixlucky@gmail.com)
2017-03-26 18:15:26

I made a tag last night

lia_johansen (lilixlucky@gmail.com)
2017-03-26 18:15:32

@binnur

binnur (binnur.alkazily@gmail.com)
2017-03-26 18:16:45

@lia_johansen keep everything from this weekend as is and tag -- when shooter is functioning please then update the speeds and validate we still have reliable shooting w/ higher speed.

binnur (binnur.alkazily@gmail.com)
2017-03-26 18:16:48

Works?

binnur (binnur.alkazily@gmail.com)
2017-03-26 18:17:16

I think we are saying the same thing :)

lia_johansen (lilixlucky@gmail.com)
2017-03-26 18:17:56

I already have a tag i made last night. U want the new rpm 1800? Even if we changed it because of the launcher issue?

binnur (binnur.alkazily@gmail.com)
2017-03-26 18:25:54

When they fix the launcher issue, let's see if we can go back to our prior numbers

paul_vibrans (pvibrans@tscnet.com)
2017-03-26 20:12:33

At Auburn Mountain View we started the tournament with a new shooter wheel. At Glacier Peak we started the tournament with a used shooter wheel that had been cleaned, maybe not enough. Do we have a new wheel that we can put on in Cheney or should we spend the time trying to really clean the old one?

riyadth (riyadth@gmail.com)
2017-03-26 21:23:49

I believe that the "overshoot" condition was the result of a second ball being launched before the flywheel had returned to its set speed (after the first ball is launched). It may be that the system is oscillating now, as it returns to the set speed, and our shot-to-shot time coincides with the time where the flywheel is going too fast. Assuming our system was properly damped before (returning to set speed swiftly without overshoot), then something physical likely changed that affected the balance. If this is the case, then we should be able to re-tune the launcher PID to the new physical state of the machine. However, if we can't determine what changed, then it may change more at Cheney, and result in PID errors again.

riyadth (riyadth@gmail.com)
2017-03-26 21:27:31

It is possible that the stuff on the shooter wheel is responsible, but I would have expected a more gradual change in performance as the deposits built up. But I think the overshoot of the PID system is likely due to added resistance on the motor/flywheel (friction?), resulting in either a more severe slowing of the wheel during shots, or the need to apply more voltage to compensate for the slowing (to overcome resistance). Maybe lubrication would help?
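
To make the friction/re-tune point concrete, here is a stripped-down velocity controller in plain Java (not our actual Talon-based launcher code; the class name and gains are placeholders). If friction increases, the feedforward term kF has to grow; leave it where it was and the proportional term ends up doing the work, which is where droop and overshoot show up:

```java
// Minimal illustrative flywheel speed controller (feedforward + proportional),
// not the robot's CAN Talon configuration. Gains are placeholders to be tuned.
public class FlywheelSpeedController {
    private final double kF;                  // volts per RPM to hold speed (rises with friction)
    private final double kP;                  // corrective gain on speed error
    private static final double kMaxVolts = 12.0;

    public FlywheelSpeedController(double kF, double kP) {
        this.kF = kF;
        this.kP = kP;
    }

    /** Voltage command for a given setpoint and measured flywheel speed. */
    public double calculate(double setpointRpm, double measuredRpm) {
        double feedforward = kF * setpointRpm;                    // steady-state estimate
        double correction = kP * (setpointRpm - measuredRpm);     // fights droop after each shot
        double volts = feedforward + correction;
        return Math.max(-kMaxVolts, Math.min(kMaxVolts, volts)); // clamp to battery range
    }
}
```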

coachchee (echee@bisd303.org)
2017-03-26 21:37:56

We do have new wheels

riyadth (riyadth@gmail.com)
2017-03-26 21:41:13

For autonomous work (ie, auto gear, shoot+gear, hopper+shoot) I'd like to suggest that the team do some work with the backup chassis (ideally with some weight added). We should be able to test repeatability and accuracy with the subsystems and commands we have, and work on improving both. Final tuning of the commands would probably have to wait for Cheney, but if we start tweaking things there we will probably never get it finished in time.

riyadth (riyadth@gmail.com)
2017-03-26 21:41:29

Is it possible to meet sometime during the week or on the weekend?

riyadth (riyadth@gmail.com)
2017-03-26 21:41:43

Would anyone be interested in working on the problem?

lia_johansen (lilixlucky@gmail.com)
2017-03-27 12:59:14

@riyadth : we are not doing the hopper shoot

lia_johansen (lilixlucky@gmail.com)
2017-03-27 13:01:27

Also we are meeting Wednesday to work on side gear auto

riyadth (riyadth@gmail.com)
2017-03-27 13:37:11

Are you going for side gear AND shoot? Or just side gear?

lia_johansen (lilixlucky@gmail.com)
2017-03-27 15:37:29

Just side gear

lia_johansen (lilixlucky@gmail.com)
2017-03-27 17:10:36

@declanfreemangleason , @brianhilst , @niklas_pruen : There is going to be a meeting this Wednesday from 3 - 6, and programmers will be working with the second chassis on the side gear autonomous. Launcher people do not need to come, as we will not be able to work with the real robot.

binnur (binnur.alkazily@gmail.com)
2017-03-28 09:01:21

@lia_johansen on wed, I will try to stop by at the end of the day - FYI

✅ lia_johansen
brian_hilst (brian@hilst.org)
2017-03-28 16:42:43

I can be there

declan_freemangleason (declanfreemangleason@gmail.com)
2017-03-29 08:19:28

I had an idea that could prove interesting, especially if we feel we need to place more gears: since we have to retune the launcher anyway, what if we re-angled and retuned it to shoot from the side gear peg? We have to sit there anyway while a gear is pulled up, and we want to go to the side gear peg in autonomous anyway. This could reduce cycle times, although we don't know if the launcher can shoot that far.

chrisrin (chrisrin@microsoft.com)
2017-03-29 08:37:36

if reangle/retune launcher is an option, then shooting from hopper during auto is another direction to consider - seems like many robots in the 40 kPa club do this & score 20+ fuel in auto

binnur (binnur.alkazily@gmail.com)
2017-03-29 08:41:28

@lia_johansen change of plans - unfortunately I won't be able to make it this evening after all / pls send out a quick update after the meet up. Thanks!

dana_batali (dana.batali@gmail.com)
2017-03-29 08:47:38

i'll be there today

👏 binnur, lia_johansen, brian_hutchison
binnur (binnur.alkazily@gmail.com)
2017-03-29 09:04:08

Awesome @dana_batali

lia_johansen (lilixlucky@gmail.com)
2017-03-29 09:23:04

@declan_freemangleason : interesting idea, but we would have to tune the launcher at cheney. My worry with that is it might not work and could mess everything up.

chrisrin (chrisrin@microsoft.com)
2017-03-29 09:36:20

FYI - other teams are making some bets on changes like this in prep for district champs and worlds: https://www.chiefdelphi.com/forums/showthread.php?t=157194

chiefdelphi.com
riyadth (riyadth@gmail.com)
2017-03-29 11:12:31

There is talk over in the #climber channel about changing the gear ratios on the climber motor (to get a faster climb). Right now the software uses percent vbus for the climber motor (which is the correct control mode to use), and it sets 90% for "fast" climb, and 45% (half of the fast) for the "slow" climb. I think we should re-consider using a ratio to select the slow speed, and instead should pick an actual percentage value. With a faster gear ratio, we probably need to reduce the slow speed below half anyway, because if we turn too fast we may not catch the rope.

riyadth (riyadth@gmail.com)
2017-03-29 11:14:21

Also, with the desire for more speed, I suggest boosting the percentage for fast climb to 100. If we increase the torque on the motor through the change in gear ratio, we are more likely to stall the motor, so giving it 100% will be more reliable than any fraction of that.

riyadth (riyadth@gmail.com)
2017-03-29 11:15:24

And do we report the climber motor current to the drivers on the driver station? It could be good for them to see if current is getting too high, which could mean that the motor has stalled. This really is only useful if they can raise the motor set point to compensate for the stall condition.
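
A rough sketch of those two suggestions together (not the team's actual Climber subsystem): absolute percent-vbus constants rather than a "half of fast" ratio, plus current telemetry from the PDP so the drivers can spot a stall. The constants and the PDP channel are hypothetical:

```java
// Hedged sketch only; values and wiring are placeholders, not the real subsystem.
import edu.wpi.first.wpilibj.PowerDistributionPanel;
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class ClimberSpeeds {
    // Absolute percent-vbus setpoints (no "half of fast" ratio).
    public static final double kFastClimb = 1.00;  // full output, hardest to stall
    public static final double kSlowClimb = 0.35;  // chosen on its own; re-tune with new gearing

    private static final int kClimberPdpChannel = 3;  // hypothetical wiring
    private final PowerDistributionPanel pdp = new PowerDistributionPanel();

    /** Call periodically so the driver station shows the climber current draw. */
    public void reportCurrent() {
        SmartDashboard.putNumber("Climber Current (A)", pdp.getCurrent(kClimberPdpChannel));
    }
}
```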

riyadth (riyadth@gmail.com)
2017-03-29 11:15:56

(And I also cannot be at the meeting today...) :-(

paul_vibrans (pvibrans@tscnet.com)
2017-03-29 14:08:56

Does the motor controller try to deliver a constant output voltage, a constant output current, or a constant duty cycle? I have always assumed it was voltage.

riyadth (riyadth@gmail.com)
2017-03-29 16:52:36

It is basically a constant output voltage, however it accomplishes this by using a pulse-width modulated signal that switches the output on and off at a high rate (15.625kHz, according to the manual). The pulse duty cycle allows a proportional voltage to appear on the motor terminals.
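
In other words, the average voltage the motor sees is roughly the commanded duty cycle times the bus (battery) voltage; the numbers below are just an illustration:

\[ V_{\text{avg}} \approx D \cdot V_{\text{bus}}, \qquad \text{e.g. } 0.75 \times 12\,\text{V} \approx 9\,\text{V} \]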

jack (jack@phroa.net)
2017-03-29 23:39:56

if it turns out that you want to use auto recordings, the buttons on the dashboard will work fine until you want to shoot. To shoot, make a recording, then do these steps on a Linux computer:

  1. ssh lvuser@roborio-4915-frc.local
  2. cd Recordings
  3. mv "<latest recording>" "<new name>"
  4. exit
  5. scp lvuser@roborio-4915-frc.local:Recordings/"<new name>" /tmp/x
  6. head -n1 /tmp/x | tr ',' '\n', then paste the output in a text editor with line numbers (or add | nl to the command)
  7. find a part with a bunch of zeroes that looks like when you've parked the robot
  8. ssh lvuser@roborio-4915-frc.local ; cd Recordings
  9. vi "<new name>", and add the line number of the right point to the third line of the file (it's only two lines right now, so add a third with G then o then start typing)

dana_batali (dana.batali@gmail.com)
2017-03-30 07:33:08

@jack - so far I follow you, the only question I have is what to type to signify shooting?

dana_batali (dana.batali@gmail.com)
2017-03-30 07:38:14

@jack, @timo_lahtinen : Jack - can you point us at the ruby script for transposing the csv files for scouting? I was thinking it would be a good idea to convert it to python and check it into the spartronics repo somewhere. Perhaps we need a new (tiny) repo called 2017-Scouting?

dana_batali (dana.batali@gmail.com)
2017-03-30 08:55:32

@declan_freemangleason : could you summarize for this channel what you learned about at Glacier Peak regarding IP-camera/roborio flakiness?

jack (jack@phroa.net)
2017-03-30 09:58:40

dana_batali: the line number of the middle zero of a given cluster of zeroes in the tr output goes on the third line of the real file on the robot. here's the script, I already gave it to ethan and jon; jon has already used it successfully on the computer that will be doing the scouting at the next events. https://gist.github.com/phroa/dcf7ec5e5007bd7d715542167f1e04fc

dana_batali (dana.batali@gmail.com)
2017-03-30 10:03:17

@jack -

  • so the signal to shoot is a 3rd line, whose value is the index into the second line where shooting is to begin?

  • fyi @joncoonan asked @timo_lahtinen to install ruby on his laptop which led to this question... Rather than add a new language to the mix, I suggested that it would be easy to convert your ruby script to python

jack (jack@phroa.net)
2017-03-30 10:04:08

dana_batali: yes, and interesting... so much for the scouting computer in the marketing box

dana_batali (dana.batali@gmail.com)
2017-03-30 10:05:27

@jack - can you point me to an example input csv file?

dana_batali (dana.batali@gmail.com)
2017-03-30 10:07:00

(i wasn't aware of a scouting computer or how it relates to marketing, nor even why @joncoonan requested programming assist, probably just a backup plan...)

joncoonan (jonathancoonan@gmail.com)
2017-03-30 10:08:48

Yeah, so the deal with needing Timo to install ruby is that he is heading scouting at Cheney and worlds. I can use the marketing laptop (which has ruby on it) to compile the scouting data, so we won't need Timo's computer; I just figured he might want to install it on his personal machine as a backup. If that creates issues for programming we can just use the pit computer.

jack (jack@phroa.net)
2017-03-30 11:12:51

oh, @dana_batali, I never specified how to run it: ruby merge_speedscout_17.rb path/to/folder/**.csv > out.csv

riyadth (riyadth@gmail.com)
2017-04-01 09:40:31

Happy Arduino Day, especially to the Bling team: https://day.arduino.cc/

day.arduino.cc
:clio: timo_lahtinen, Harper Nalley
riyadth (riyadth@gmail.com)
2017-04-01 12:22:21

FYI, Binnur and I cannot make it to Cheney next week. Too many things going on at work for both of us.

😢 dana_batali
riyadth (riyadth@gmail.com)
2017-04-01 12:22:49

We will monitor Slack if there are questions about debugging. Let us know if we can help remotely.

brian_hilst (brian@hilst.org)
2017-04-05 20:55:40

@declanfreemangleason @liajohansen What time do you expect to begin testing the new autonomous strategies? Niklas and I are planning to come over for that.

lia_johansen (lilixlucky@gmail.com)
2017-04-05 20:57:12

@brian_hilst : we plan on starting testing at 9:30 am. We already tested one of the side gears and it was successful. You definitely do not need to come at 9:30 am. We can keep u updated

💥 binnur
brian_hilst (brian@hilst.org)
2017-04-05 20:59:11

Ok. Thanks!

chrisrin (chrisrin@microsoft.com)
2017-04-08 21:52:54

I wasn't sure whether to share here or Random. This is something Bear Metal threw together using field & robot CADs + the Unity game engine + some programming. They showed it to me while I was browsing the pits. I guess a couple team programmers built it Sunday- Wednesday this week. Anyway, pretty cool - could be a way to create a driving practice sim that anyone on the team could play with. The Bear Metal folks were very nice and enjoyed sharing it - they might even share the source & how to integrate another robot CAD if asked (perhaps post-season).

paul_vibrans (pvibrans@tscnet.com)
2017-04-09 11:37:45

I just watched QF3-2 at McMaster University and one robot shot 10 balls in auto with one obvious miss and still got 10 kPa before teleop. What can we do about inconsistent ball counting?

paul_vibrans (pvibrans@tscnet.com)
2017-04-09 11:39:07

I wouldn't be surprised if some fields consistently count low.

paul_vibrans (pvibrans@tscnet.com)
2017-04-09 11:41:33

Or did I see a team that was able to sneak an 11th ball into their hopper?

riyadth (riyadth@gmail.com)
2017-04-09 11:44:27

I think the trick is to catch the miss as it falls off the boiler, and shoot it again :-)

paul_vibrans (pvibrans@tscnet.com)
2017-04-09 11:57:15

The miss I saw stayed on the top of the boiler in the net. All other shots went in and no hoppers were dumped in auto. There was only one shooter.

paul_vibrans (pvibrans@tscnet.com)
2017-04-09 11:58:37

I wonder if we could get alliance mates to shoot into our hopper as we make our turn toward the boiler.

brian_hilst (brian@hilst.org)
2017-04-20 10:46:04

The Worlds matches have started. We are currently 1:1. Our remaining matches today are scheduled for 12:18pm & 1:24pm. They are roughly on schedule.

Here is a link to our schedule: https://www.thebluealliance.com/team/4915/2017

You can watch live at: https://atthecontrol.com/dashboard/home/HOPPER/4915

If they post videos for prior matches, they should be at: https://www.thebluealliance.com/team/4915/2017#videos

There are 6 matches tomorrow, starting at 6:00 AM

brian_hutchison (savingpvtbrian7@gmail.com)
2017-05-21 17:57:27

https://youtu.be/XfAt6hNV8XM

YouTube
} Brian Douglas (https://www.youtube.com/user/ControlLectures)
brian_hutchison (savingpvtbrian7@gmail.com)
2017-05-21 17:58:01

https://youtu.be/UR0hOmjaHp0

YouTube
} Brian Douglas (https://www.youtube.com/user/ControlLectures)
brian_hutchison (savingpvtbrian7@gmail.com)
2017-05-21 17:59:17

These two videos helped me to understand PID control and I think that they would be a good way to explain PID to beginners for next year

👍 riyadth, declan_freemangleason, lia_johansen, james, Terry
binnur (binnur.alkazily@gmail.com)
2017-05-22 13:26:19

cool - programming leads, I suggest either starting to build a resources list on our github, and/or pin these items to this channel so they don't get lost

james (james@slattery.tech)
2017-05-22 13:29:44

@james pinned a message to this channel.

YouTube
} Brian Douglas (https://www.youtube.com/user/ControlLectures)
james (james@slattery.tech)
2017-05-22 13:29:49

@james pinned a message to this channel.

YouTube
} Brian Douglas (https://www.youtube.com/user/ControlLectures)
chrisrin (chrisrin@microsoft.com)
2017-06-03 11:03:25

A couple fellow Microsoftees who mentor our perennial neighbors and friends, 4911, reached out to me with an opportunity to share and help extend their scouting platform. I believe they also worked with 4663 on it. Anyway, the solution involves inexpensive Android tablets (they use the cheapest Kindle Fires) in the stands + a Windows laptop as a server, with connectivity over Bluetooth. Rose expressed interest, but after receiving more information, what would be required is A) a mentor who can help with setting up the end-to-end solution and B) at least one programmer student who can participate in one of the three areas of student contribution:

  1. Android App – design and UI programming for web/mobile development inclined students
  2. Data movement - pushing data from app to server for students who are inclined for "systems" work
  3. Tableau / Data Analysis - for statistics / machine learning inclined students

In a response to this message, I will attach a PDF of a more detailed email from one of the mentors. Please let me know if there is interest so I can get back to them. Thanks!

ronan_bennett (benneron000@frogrock.org)
2017-06-03 11:54:55

*Thread Reply:* @chrisrin I'm interested in helping out, but don't have any experience in JSON, SQL, databases etc. I'd have to learn as I went along, which I'd definitely be willing to do.

chrisrin (chrisrin@microsoft.com)
2017-06-03 12:06:30

*Thread Reply:* I recall interest in cultivating Tableau skills on the team, and I suspect it would be a quicker area to ramp up on than the multi-layer Data Movement area - maybe Tableau would be an area for someone to jump in with 4911? Get the Tableau tool and do online training to start perhaps?

chrisrin (chrisrin@microsoft.com)
2017-06-03 12:10:01

*Thread Reply:* @ronan_bennett It would be really great useful stuff to learn, but there will be a learning curve. Let's see if others are also interested & in what areas, and then we can have a dialog with Anne and Johan from 4911 to see if students need to have deep skills coming in or if they can be less experienced.

ronan_bennett (benneron000@frogrock.org)
2017-06-03 12:11:23

*Thread Reply:* @chrisrin Ok, sounds good.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-06-03 12:13:23

@chrisrin I would definitely be interested in contributing

declan_freemangleason (declanfreemangleason@gmail.com)
2017-06-03 12:14:07

For anyone else who wants more information, https://github.com/frc4911 is their GitHub.

GitHub
james (james@slattery.tech)
2017-06-05 13:08:54

How come the Spartronics Git Organization does not have any public members, so that the org shows on their personal profiles? I think it would be a cool way to show others that you are a part of the team.

riyadth (riyadth@gmail.com)
2017-06-05 14:09:34

Good point! It turns out each contributor gets to set themselves as "public" or "private", with the default being private. I just made myself public (because I do want the Spartronics logo on my Github profile page).

riyadth (riyadth@gmail.com)
2017-06-05 14:10:14

To make the change, go to the Spartronics Github page, click on "People" tab, and then click on your name. In the box on the left side of the page will be a selector to choose public vs. private.

james (james@slattery.tech)
2017-06-05 14:11:38

Ah, alright. Could someone with access merge these: https://github.com/Spartronics4915/developers_handbook/pulls ? It would be greatly appreciated 👍:skin-tone-2:

GitHub
binnur (binnur.alkazily@gmail.com)
2017-06-05 16:19:26

I set a reminder for myself to do it tonight, unless someone gets to it faster ^^ @jack @declanfreemangleason @liajohansen -- (with this said, I don't recall the permissions either 🙂 )

chrisrin (chrisrin@microsoft.com)
2017-06-05 21:59:19

I received some more information about the scouting app collaboration opportunity from the other Microsoft 4911 mentor, Anne, so I tacked the info on to what Johan already provided in the Technology Summary.

chrisrin (chrisrin@microsoft.com)
2017-06-05 22:06:51

Tech summary for the 4911 scouting app collaboration opportunity:

  1. Android programming (Java) for the app and Bluetooth client
  2. C# .NET programming for the server
  3. Tableau for the analytics

So far, I've heard interest from @rosebandrowski (not very interested in programming - maybe Tableau?), @ronanbennett (Java/Android maybe - any Android experience?), and @declan_freemangleason (maybe the end-to-end solution set up? or Java/Android). Rose, there was mention of graphic design, but the catch is that there is a little bit of programmy work that goes with it. I recommend reading the updated PDF above.

Any mentors have .NET/C# knowledge that could help with that part?

rose_bandrowski (rose.bandrowski@gmail.com)
2017-06-05 22:07:39

@rose_bandrowski has joined the channel

rose_bandrowski (rose.bandrowski@gmail.com)
2017-06-05 22:09:16

If I'm needed for graphics that is fine. I've taken ap computer science (java), know html quite well, and a little C++ and javascript. I just hope that people more interested in programming than me take it on.

chrisrin (chrisrin@microsoft.com)
2017-06-05 22:12:24

I have a few questions for Anne and Johan on the opportunity, like time commitment, when the collaboration will take place, how it will be done (e.g. video calls + Slack). If others have questions, please reply to this post, and then I can send them one bundle of questions from us.

jack (jack@phroa.net)
2017-06-05 22:12:28

I'm moderately familiar with android, much more with java in general. I'd be open to working with C# but I'd have to see if I can set up some kind of Mono dev environment

chrisrin (chrisrin@microsoft.com)
2017-06-22 00:47:14

Java-based vision solution from stronghold... could be worth checking out: https://www.chiefdelphi.com/forums/showthread.php?threadid=142173

chiefdelphi.com
coachchee (echee@bisd303.org)
2017-06-22 11:18:04

Thanks Chris. Programmers, check it out.

chrisrin (chrisrin@microsoft.com)
2017-07-17 11:08:36

I have some questions about how controls have been set up to work the past couple years... For Helios, with the Xbox controller, is it set up so the two joysticks on the controller are used, one controlling throttle of the left side & the other controlling throttle of the right? And if you want to turn while driving, you give one side more throttle than the other (i.e. like a differential/tank drive)? Or were throttle and steering separate, and the software figured out the differential? I'm similarly curious about how the flight stick controller works both with Helios and with Ares. Throttle and steering combined, supported by software I assume. Thanks

riyadth (riyadth@gmail.com)
2017-07-17 14:30:56

We have used "arcade drive" on our robots so far. That uses a single joystick to control both forward/reverse and steering (having the software take the x/y input from the joystick and figure out the relative speeds of the motors). This is a module provided by FIRST, and is fairly easy to implement. Our only modification to the standard usage (as far as I know) is to implement a "throttle" control that sets the maximum speed of the motors, allowing for more control of the robot in certain applications. Oh, and we also implemented a "reverse" mode, where the front of the robot becomes the back, from the driver/joystick point of view.

riyadth (riyadth@gmail.com)
2017-07-17 14:33:24

"Tank drive" is an alternative control scheme, also provided as a module by FIRST. That allows two joysticks to independently control the left and right motors. It is not to be confused with tank "treads" vs. wheels (unfortunately it is often confused in general conversation...). We never used that mode, due to driver preference (or due to the lack of trying it...)

chrisrin (chrisrin@microsoft.com)
2017-07-17 16:16:50

Thanks, Riyadth!

chrisrin (chrisrin@microsoft.com)
2017-07-19 07:39:23

Follow-up on the above... After doing some reading, it looks like the "single stick" arcade drive is commonly used with the flight stick style controllers. With Xbox & similar controllers, it seems, a more common approach may be to control left/right axis with the right side mini-joystick and forward/backward axis (i.e. throttle) with the left side mini-joystick. Some also seem to like adding on extenders (commonly used by FPS gamers) to the mini joysticks for added travel/precision. With tank drive, as one might assume, teams use two identical joysticks, ranging in size from controllers with two side-by-side mini joysticks (PS2-like?) up to two full-size flight sticks.

peter_hall (llahnhojretep@gmail.com)
2017-07-29 10:41:50

@peter_hall has joined the channel

chrisrin (chrisrin@microsoft.com)
2017-07-30 10:11:22

Hi, I was talking with Peter and Samantha during the BARN session yesterday about potential projects & raised the idea again about custom operator controls like these: https://1drv.ms/f/s!AikCDwtdoW5Lqj66386jgdCO_tXj. One project could be to build a proof of concept for Helios's controls in the preseason... I looked at the operator flight stick, and the labels are below - would one of you please confirm what each operation is that is not marked understood? Thanks...

FLIGHT STICK HELIOS CONTROLS

  • Lower left: on and off (intake I assume?)
  • Lower center - left: reverse (not sure - also intake?)
  • Lower center - right: slow (intake?)
  • Lower right: climb and off (understood)
  • Stick - left: one shot (understood)
  • Stick - center: launch (understood - both hopper & shooter, right?)
  • Stick - right: unjam (understood - hopper)
  • Stick - lower center: stop (understood - hopper/shooter)

Trigger do anything?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-08-03 12:12:18

If you have the time, I highly recommend this video: https://youtu.be/8319J1BEHwM It covers a lot of information on complex autonomous, mostly on motion planning, trajectory computing and following, but also a little on vision.

👌:skin-tone-3: james, Terry
chrisrin (chrisrin@microsoft.com)
2017-08-05 18:11:33

For a while at robotics, Samantha and I discussed the custom operator controls idea further. Here's a concept that came out of that...

peter_hall (llahnhojretep@gmail.com)
2017-09-08 14:07:41

Looks great

peter_hall (llahnhojretep@gmail.com)
2017-09-08 14:08:26

That would be a great preseason project or even something that we could work on during build season

chrisrin (chrisrin@microsoft.com)
2017-09-12 19:27:30

Related to controls, this thread discusses numerous interesting drive-team capabilities that teams have utilized. Lots of possibilities... https://www.chiefdelphi.com/forums/showthread.php?t=158337&highlight=Fpv

chiefdelphi.com
paul_vibrans (pvibrans@tscnet.com)
2017-09-12 20:55:52

For what it is worth, the US Coast Guard requires the gauges in their helicopters to be mounted on the instrument panel so that the pointers point straight up when the indicated parameter is at the correct operating value. At a glance, a pilot can tell if something is not right and to some extent how big the problem is.

chrisrin (chrisrin@microsoft.com)
2017-09-17 11:30:32

Looking around at what kinds of sensors teams use along with programming to aid navigation & driving. There are a couple intriguing navigation IMUs (inertial measuring unit, I think), and I'm wondering if we've ever taken a look at them. They are: A) NavX MXP and B) CTRE/Gadgeteer Pigeon. These units bundle multiple sensors together, and as far as I can tell, teams with swerve drives often use these to constantly track orientation and thus simplify steering for drivers. It seems like there should be other uses, even without using swerve. Both are on the AndyMark controls parts page: https://www.andymark.com/Controls-s/262.htm Thanks

andymark.com
paul_vibrans (pvibrans@tscnet.com)
2017-09-17 16:18:51

AndyMark is also selling an Analog Devices Gyroboard, a single axis rate gyro. The description says one of these is in the 2017 Kit of Parts so we should have one somewhere.

Terry (terry@t-shields.com)
2017-09-19 11:58:28

Ha! This is awesome. I'm working with a FIRST LEGO League rookie team and they are just discovering the gyro sensor that comes in the LEGO robot kit. They have already learned how valuable it is --- and how it has some drawbacks that programmers have to compensate for (namely, calibration and lag). As a side note on the LEGO gyro device, sometime over the last couple of years LEGO changed the gyro sensor without any announcement. It appears the newer gyros are now dual-axis but LEGO has not fully utilized the dual-axis functionality yet. However, how you calibrate the old vs. the new requires different programming!

dana_batali (dana.batali@gmail.com)
2017-09-22 15:32:56

@chrisrin: we have been using a 9 degree of freedom imu for the last couple years. This is how we know we're driving straight or turning a specific angle (as for autonomous)

randy_groves (randomgrace@gmail.com)
2017-09-29 20:33:50

@randy_groves has joined the channel

mrosen (michael.rosen@gmail.com)
2017-10-03 17:29:20

Wait. What? Last year’s robot had an IMU to help it drive straight? Learn something new every day!

I don’t remember anyone working on that.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-03 17:55:49

@mrosen Nicklaus worked on it. There was actually a fair amount of drift that needed to be corrected for; it wasn't full-on PID, but it seemed to work. https://github.com/Spartronics4915/2017-STEAMworks/blob/238800b69e78f706c21e4f4900687bd4fe5e3eb3/src/org/usfirst/frc/team4915/steamworks/commands/DriveStraightCommand.java#L130

GitHub
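
In the same spirit as that command (but simplified, with a made-up gain and method names; the sign convention depends on the IMU and may need flipping), the drift correction boils down to a proportional nudge on the two sides based on heading error:

```java
// Simplified illustration of holding a heading while driving straight;
// not a copy of the linked DriveStraightCommand.
public class DriveStraightHelper {
    private static final double kHeadingGain = 0.03;  // output per degree of error (placeholder)

    /**
     * @param basePower         desired forward power, -1..1
     * @param targetHeadingDeg  heading captured when the command started
     * @param currentHeadingDeg latest IMU heading
     * @return {left, right} motor outputs with a small correction toward the target heading
     */
    public static double[] correct(double basePower,
                                   double targetHeadingDeg, double currentHeadingDeg) {
        double error = targetHeadingDeg - currentHeadingDeg;
        double correction = kHeadingGain * error;   // flip the sign if it corrects the wrong way
        return new double[] { basePower - correction, basePower + correction };
    }
}
```
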
mrosen (michael.rosen@gmail.com)
2017-10-03 18:37:36

Nice. I get it: keep going in the direction you were originally pointed. I guess that without sustained running, the PID parameters aren't that critical.

Kenneth Wiersema (kcw815@icloud.com)
2017-10-07 18:22:28
Kenneth Wiersema (kcw815@icloud.com)
2017-10-07 18:23:19

Here's the CAD file for Synthesis of Helios, if anyone's interested. I tried to set it up with the ports you guys assigned to the robot, but message me if there are problems with the file, or whether the file works to begin with. I did run into some odd things just driving it with the software.

Kenneth Wiersema (kcw815@icloud.com)
2017-10-07 18:24:04

No bumpers and I don't think the intake will work, but that's as far as I know

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-07 18:27:52

@Kenneth Wiersema Thanks Kenneth! Accurate or not, I think that having a working simulation can be beneficial for the testing process, especially in regard to non-accuracy-sensitive bugs.

Kenneth Wiersema (kcw815@icloud.com)
2017-10-07 18:28:21

You're welcome

chrisrin (chrisrin@microsoft.com)
2017-10-07 20:07:52

If you go to the Synthesis forum, the dev team will help you out if you encounter problems. Also, Sotabots has uploaded their past couple robots to Synthesis, and you could probably preemptively ask someone on their programming team if they have learned anything that makes the process easier. Good luck! I think it's amazing they are planning to have the 2018 game field available by the end of kickoff day.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-08 13:31:13

@chrisrin I know you've had a lot of interest in different control configurations, so I modified the 2017 codebase to allow on-the-fly detailed customization of controls so you and the drivers can find what works best. I used that as an opportunity to try out the Synthesis code emulator, and I've gotten it working enough to test my modifications and fix a few bugs in the new code. There are a number of things that the emulator doesn't support (yet?), most notably CAN Talon, that I had to convert into a bunch of dummy code to get it to run. It did work with our web dashboard, which in this situation was mostly what I needed for testing. Hopefully we can get a chance to test the new stuff out on a real robot so you can let me know what you think. The on-the-fly customization probably won't be useful once we get around to competition season, but I do think that it could help our new drive team in the preseason a bit. (Screenshot is of the dashboard connected to an emulated robot, running the new controller code.)

chrisrin (chrisrin@microsoft.com)
2017-10-08 13:59:31

@declan_freemangleason love the configurability... Standard Xbox controller config is left joystick = forward/backward and right joystick = left/right. To do just that, would the config just be... Rotation (i.e. left/right): +/- sqrt(RJOYX) Forward: +/- LJOYY ...and then the triggers could be used as dampeners for slower, more precise movement?

Thanks for doing this!

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-08 14:04:14

@chrisrin Yeah, that config is exactly right! If we wanted dampening we would just multiply by the trigger values. (E.g. LJOYY * LTRIG)

chrisrin (chrisrin@microsoft.com)
2017-10-08 14:11:09

@declan_freemangleason so cool! This will help the drivers optimize controls much faster than if code mod were required each config change. Only other question is... are there varying value ranges to be aware of for joysticks vs. triggers (e.g. joysticks are 0-255 in each dimension but triggers are 0-99 or something like that)? I also heard the resolution of the Xbox Elite controllers is higher than the standard ones, but I suspect that may not be reflected in the actual numeric values.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-08 14:30:29

@chrisrin I'm pretty sure that all the values for the triggers and joysticks are between -1 and 1, but if they aren't consistent it should be an easy fix in the code. Although the Xbox Elite controller is probably more accurate, I know it has the same precision as a regular Xbox controller.

chrisrin (chrisrin@microsoft.com)
2017-10-08 14:49:21

@declan_freemangleason ok, thanks. so with <1 absolute values, the sqrt function results in a fairly aggressive response curve if I'm thinking about it right. And a slower-than-linear response curve could be achieved using the pow([input#],[power argument]) function, remembering to add the (-1) back to the function for going backward or left. Is only sqrt covered in the code, or can other functions like pow, log, exp be used?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-08 18:49:03

@chrisrin There's currently sin, cos, tan, sqrt, log, exp, and the ^ operator (instead of pow). It's really easy to add anything else if we need it.

coachchee (echee@bisd303.org)
2017-10-08 19:52:29

Thanks Declan and Chris.

chrisrin (chrisrin@microsoft.com)
2017-10-09 10:10:51

One thing that came to mind on the ferry this morning is the opportunity to use a y=n+f(x) form for the controller response in order to use the full range of physical travel of the joystick or trigger, where n = the minimum number >0 for the robot to move. You could find n by simply plugging in constants, starting with a low number like .01 and increasing it until the robot moves when the control is pressed. And from there find an f(x) that will result in the desired response curve where y = 1 = (n + f(x=1)). You can play around with graphing tools like the one below as well to find possible functions to use. In this example, the curve from the squaring function was too gentle and the curve from the tan() function was too aggressive, so I averaged the two. Anyway, this is something to potentially try during controls tuning. Plus it is a fun example of applying these math functions to something real. Here's the example: http://www.mathsisfun.com/data/grapher-equation.html?func1=y%3D.2%20%2B%20((x-.115)%5E2%20%2B%20tan(.67**x))%2F2&xmin=-1.450&xmax=1.450&ymin=-1.088&ymax=1.088

riyadth (riyadth@gmail.com)
2017-10-09 15:03:00

@chrisrin I believe this is similar to how we have used a "throttle" control on our full-sized joysticks. There is a small potentiometer paddle that we used to set a scaling factor for the primary joystick x and y axes. That way, the driver could set the scaling factor dynamically.

chrisrin (chrisrin@microsoft.com)
2017-10-09 17:55:22

@riyadth Yep, similar. Declan and I were pondering using the xbox controller joysticks as the main controls (LJOYY=forward/backward, RJOYX=left/right) and then using the paddles as dampeners for finer control when needed. Something like... Forward speed = n + (f(LJOYY) * (1 - (LTRIG * 0.8))) maybe, where n = # >0 needed for slowest possible speed, f(LJOYY) is the main response curve, and the (1 - (LTRIG * 0.8)) factor results in dampening from 0% to 80% depending on how much the trigger is pressed.
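
A small sketch of that shape, as a variation on the y = n + f(x) form: here f(x) is scaled by (1 − n) so the output tops out at 1, and the trigger removes up to 80% of the output. The constants are placeholders to be found empirically, as described above:

```java
// Illustrative response curve with a minimum output "n" and a trigger dampener.
public class ResponseCurve {
    private static final double kMinOutput = 0.20;   // "n": smallest value that moves the robot
    private static final double kMaxDampening = 0.8; // trigger can remove up to 80% of output

    /** Shaping function f(x) for x in [0, 1]; squaring gives a gentle start. */
    private static double shape(double x) {
        return x * x;
    }

    /**
     * @param stick   joystick axis value, -1..1
     * @param trigger dampening trigger, 0..1 (0 = no dampening)
     * @return shaped output, preserving the sign of the stick input
     */
    public static double apply(double stick, double trigger) {
        double magnitude = Math.abs(stick);
        if (magnitude < 1e-3) {
            return 0.0;   // keep a true deadband at zero
        }
        double shaped = kMinOutput + (1.0 - kMinOutput) * shape(magnitude);
        double dampened = shaped * (1.0 - kMaxDampening * trigger);
        return Math.copySign(dampened, stick);
    }
}
```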

vogl_madeline (voglmad000@frogrock.org)
2017-10-11 21:34:42

@vogl_madeline has joined the channel

adam_rideout_redeker (adamrr100@gmail.com)
2017-10-11 22:19:30

@adamrideoutredeker has joined the channel

mrosen (michael.rosen@gmail.com)
2017-10-12 11:01:20

Geek Humor: https://twitter.com/chronum/status/540437976103550976

A connection to make it more interesting. I saw this only b/c John Carmack retweeted this. Carmack is the guy who did ID software (Quake, Wolfenstein, Doom) and now Oculus VR.

mrosen (michael.rosen@gmail.com)
2017-10-12 11:13:10

About the humor: You need to click on the link to see what the picture is describing: "Multithreading in Theory and Practice." Really funny.

But actually, that was just what came to mind while I was thinking about Binnur's remark about "Herding Cats." This really is a big thing in the software industry. Years ago, I used to work at EDS (in Poulsbo, who knew?). Anyhow, this giant systems integrator decided they need to spend tons of money on a SuperBowl commercial. This is the result: https://www.youtube.com/watch?v=m_MaJDK3VNE

YouTube
} CBS (https://www.youtube.com/user/CBS)
binnur (binnur.alkazily@gmail.com)
2017-10-12 11:19:38

^^ oh, yea -- that pretty much sums up my day job!! just had to course correct w/ my dev team on top priorities. #daily_challenge

👍 mrosen, kaedric_holt
justice_james (jj@j-james.me)
2017-10-12 16:27:14

@justice_james has joined the channel

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-12 16:53:00

@Darwin Clark has joined the channel

Ryan_Olney (olneyrya000@frogrock.org)
2017-10-12 17:01:58

@Ryan_Olney has joined the channel

Ulysses Glanzrock (glanzuly000@frogrock.org)
2017-10-12 18:50:48

@Ulysses Glanzrock has joined the channel

austin_smith (domolord156@gmail.com)
2017-10-13 08:46:14

@austin_smith has joined the channel

Willie_Barcott (barcowil000@frogrock.org)
2017-10-13 13:45:01

@Willie_Barcott has joined the channel

Josh_Goguen (goguejos000@frogrock.org)
2017-10-14 13:39:00

@Josh_Goguen has joined the channel

_charlie_ (courteneystandridge@gmail.com)
2017-10-14 17:49:06

@charlie has joined the channel

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-15 14:26:22

@declan_freemangleason pinned a message to this channel.

} Declan Freeman-Gleason (https://spartronics.slack.com/team/U2UQY5UTH)
Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 16:28:25

Hi all, this is a presentation that I created about what I did over the summer in regards to the vision platform. I would be more than happy to see a few comments or questions. Thanks!

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 16:28:42
Cory_Houser (fishy.hero@gmail.com)
2017-10-15 18:19:20

@Cory_Houser has joined the channel

binnur (binnur.alkazily@gmail.com)
2017-10-15 18:32:43

programming team from last year — any tricks to IMU reset?? Aside from not being 100% sure about the red vs. blue side robot positioning, it didn't look like our robot was turning correctly in autonomous — @declan_freemangleason please sync w/ rose to make sure we have that straightened out next week. thanks!

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-15 18:48:05

@binnur The BNO055 doesn't provide functionality to zero itself on the fly, so its position when you get an instance of it (it's a singleton) for the first time is the zero. We get an instance in the constructor of Drivetrain, which is constructed in robotInit. For @rose_bandrowski that means that the robot has to be in its final position when you turn it on, if you want autonomous turning to be accurate.

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 18:59:02

When you say 'reset', do you mean reset all the values to 0?(X,Y,Z)? I remember having some functionality like that last year in FTC.

Cruz_Strom (cruzrstrom@gmail.com)
2017-10-15 19:00:30

@Cruz_Strom has joined the channel

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-15 19:07:12

@Darwin Clark When I said the BNO055 doesn't provide the functionality, I actually meant that the code we use to access it doesn't expose the functionality... That means we could add it, but we just didn't have time during the build season.
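
If we did add it, one low-risk way is a software offset kept outside the BNO055 driver; a hedged sketch, with hypothetical method names (our actual IMU wrapper may differ):

```java
// Illustrative software "zero" for an IMU heading; not the current wrapper code.
public class HeadingZero {
    private double offsetDegrees = 0.0;

    /** Call when the robot is sitting in its known starting orientation. */
    public void zero(double rawHeadingDegrees) {
        offsetDegrees = rawHeadingDegrees;
    }

    /** Heading relative to wherever zero() was last called, wrapped to [-180, 180). */
    public double getHeading(double rawHeadingDegrees) {
        double heading = rawHeadingDegrees - offsetDegrees;
        heading = ((heading % 360.0) + 360.0) % 360.0;   // normalize to [0, 360)
        if (heading >= 180.0) {
            heading -= 360.0;                            // shift to [-180, 180)
        }
        return heading;
    }
}
```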

binnur (binnur.alkazily@gmail.com)
2017-10-15 19:20:03

Hmm. - that is what I thought, but felt maybe we got red/blue swapped or something weird — basically did opposite of expected

binnur (binnur.alkazily@gmail.com)
2017-10-15 19:20:46

We’ll make sure it works next Sunday - and ready for autonomous -

chrisrin (chrisrin@microsoft.com)
2017-10-15 20:46:51

Wish list item for the drive team: lower latency video feed from the robot this year. I did some searching around, and the 7th post in this thread seems promising. Would someone please take a look to see if it is an approach we could use? Here's the link: https://www.chiefdelphi.com/forums/showthread.php?t=156781&highlight=camera+lag+ms And I'll copy in the post for convenience...

This year was the first we really found a need for a first person view from the robot for our driver in order to locate gears on the other side of the field. So we did some research into how to get the best quality back inside 2MB with decent latency.

The only way to get anything we deemed reasonable inside the 2MB was to use h.264 encoding. The Rio just isn't capable of doing this while running robot code and keep any latency (lag) out of the system. The answer was in off the shelf security cameras. Most of them have 720p (20fps) output over RTSP with h.264 encoding. This worked well, but we dropped the resolution down to VGA (still at 20fps) to drop the latency down to just a few ms.

This is the specific camera we used: http://www.microcenter.com/product/4...amera_with_PoE It was a pretty simple job to take it apart and make a smaller enclosure so it fit better.

(thanks -Chrisrin)

chiefdelphi.com
declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-15 21:18:38

I think that's definitely something we should look into... We could also try doing h.264 encoding on a Raspberry Pi or Jetson, which would work nicely with any vision system also running there.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-15 21:21:36

Both the Jetson and the Raspberry Pi apparently have hardware-accelerated h.264, but getting that to work might be quite a can of worms.

mrosen (michael.rosen@gmail.com)
2017-10-15 21:46:55

@Darwin Clark , your exposure to OpenCV is very impressive. My quick look over this suggests you were able to use the Python interface to recognize rectangular features but stopped short of actually (a) identifying the targets of interest or (b) putting them in a reference frame that includes our robot (so we could steer toward it, shoot at it). Is that right? I'd love to hear more. Do you have thoughts on how to address these?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-15 21:51:11

@Darwin Clark What version of OpenCV is your code written for? Also, nice work 🙂

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 21:53:00

Identifying targets of interest will vary a lot based on the game. In the event that the game involves rectangular reflective tape, it should be a breeze (such as in Stronghold). My current working idea in regard to driving to a target is to mark the field of view by physically testing it on the robot (watch the live camera view and move an object until it shows up in the camera view)

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 21:53:06

Something like this:

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 21:53:28

Its a wee bit tedious, and I'm open to other ideas

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 21:54:01
chrisrin (chrisrin@microsoft.com)
2017-10-15 22:17:38

Since folks are talking about vision, I thought I would re-share this little arm+gpu+cam solution that started on kickstarter and is now in full production. It's relatively cheap - I'm thinking of getting one to play with and possibly mount on the little track drive robot my son Lucas and I started building this summer. https://www.jevoisinc.com/

jevoisinc.com
Darwin Clark (darwin.s.clark@gmail.com)
2017-10-15 22:38:00

That seems pretty interesting regarding the neural net. I really wanted to integrate that into my vision platform, but never figured out how. Do you know if the example code is online somewhere @chrisrin ?

chrisrin (chrisrin@microsoft.com)
2017-10-15 22:48:43

maybe here? http://jevois.org/basedoc/group__darknetprof.html I have to be honest; I am interested in the capabilities, but the programming itself is (at this point) beyond me.

dana_batali (dana.batali@gmail.com)
2017-10-15 22:49:20

@Darwin Clark: nice presentation and great that you've developed some real experience here. The python+opencv+jetson is exactly the approach we followed two years ago, so it might be useful to peruse the github from that year. As we briefly discussed, there are significant challenges keeping co-processors running during a match (separate power requirements, extra networking, physical stability of the mount point, more tool chains to keep up-to-date, more software deployment issues, etc). If our vision solution really requires extra compute power and we have time to properly design its location on the robot, then this is still a good way to go. If you look at the repo from a couple years back (stronghold, i think) you'll see that we used the python binding of network tables to communicate our vision results back to the roborio. We also had a streaming web server running on the jetson that could be viewed from the driver station.

Other approaches that are available:

  • there is now an opencv option (with java bindings) running on the roborio... The only disadvantage of this approach is that it consumes resources from the roborio, and one would need to perform careful analysis of the final vision algorithms running on the cpu with other software running in game configuration to ensure that we're within the capabilities of the roborio. Chief Delphi has some proponents of this approach; there are also opponents.

  • last year we purchased a more canned solution in the form of CMU cam: this is something similar to what @chrisrin refers to above: namely it has both camera and processor bundled into a single package. The advantage of the CMU cam (aka pixy) is that its simple algorithm is guaranteed to run at ~50hz and requires no real "vision algo dev" on our part. The downside is that it only recognizes blobs with a strong hue component. In my experience, this may not actually be such a bad limitation. That said, we didn't explore this option in depth last year because vision was determined by captains to be low on the priority list - and less important for our gameplay strategies (where we'd find known-good shooting positions manually).

  • i recall perusing the other option mentioned above (jevois) and since it wasn't available and didn't seem to offer significant advantages over our jetson solution, it wasn't pursued further.

dana_batali (dana.batali@gmail.com)
2017-10-15 22:51:23

Regarding video latency:

  • there are two issues here: frame rate and latency. I believe that the latency issue is really the challenging problem, more than the encoding. That is: we did get reasonable frame rates, it's just that they were from 1/2 a second prior, which is really tough for drivers.

Regarding ip cameras:

Other tidbits:

  • last year we also experimented with a usb cam and a server on the roborio. We found that the roborio could only serve a single camera, so we ended up with two cameras: an IP camera for the forward direction and the usb camera for the back view (or vice-versa).
  • setting up the video server on the roborio wasn't trivial, but I believe @riyadth crossed all the tees on that and it did end up working reasonably well (except the latency problem).

  • for streaming jpeg encoding, we found that chrome browser performed terribly. To this end, we adopted Firefox for the driverstation and it was way better.

  • some cameras offer wide fields of view. We have one of these but it wasn't as plug-and-play, since the resolutions it supported weren't standard. We purchased a weird macro lens (made for iPhones, $10) and glued it onto the ip camera. This worked surprisingly well. It does underscore the point that what a driver might need from a camera might be at odds with what a vision system might need. (it's harder to invert spherical projections than perspective projections, and this might make vision a little more challenging).

chrisrin (chrisrin@microsoft.com)
2017-10-15 22:52:44

I thought I remembered you saying we used the IP camera - I shared that thread because the approach to reducing the latency seemed worth investigating - the 640x480 @ 20fps and just a few ms lag sounded good for the drivers - thanks

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-16 08:42:37

@dana_batali In regard to computation power, when I was testing I watched the CPU as well as RAM usage of the Jetson. The CPU seemed to flat line around 70%, with only one or two gigs of ram being used. In short, I'm skeptical of moving to a different platform for lack of computing power, but I don't think its entirely out of the question.

dana_batali (dana.batali@gmail.com)
2017-10-16 09:38:04

@Darwin Clark: i agree that jetson likely offers us the most compute power and that non-trivial vision probably can't be done on the roborio for lack of resources. There is a middle road used by several teams: beaglebone or raspberry pi. These don't have quite the power of the jetson, but consume fewer watts. Not to be taken lightly: the acquisition rate of any vision solution. The usb (2) bus can't deliver 30 fps at DVD quality. Next-gen vision solutions often offer a fast-path from camera to the compute engine. The jetson TX1 has this (and we have one of these, we could explore). In stronghold, we found that we could only get around 10fps from our jetson plus usbcam (even at lower resolution) and this has implications for how to integrate vision target acquisition into the control loop. The CMU camera has the nice property of delivering targets at 50 fps. At this higher frame rate, it's possible to integrate vision targeting in the inner control loop of the robot. But at 10 fps, we would need to have a slug-speed robot to ensure that the closed-loop feedback doesn't cause significant oscillations. We did some bizarre (unintentional) robot dances during the stronghold development. To achieve fast targeting with a slow vision target acquisition rate, the standard approach is to identify the target first (usually as an angular offset to the current heading), then use the imu and pid controller to get to the target quickly.

Anyway it's great that you have both the interest and the skill set to look into these questions. I'll be happy to help!
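
A minimal sketch of the split described above, assuming hypothetical read_gyro_heading(), latest_vision_offset() and drivetrain_arcade() stand-ins (gains and rates are made up): the slow vision result only re-anchors a heading setpoint, while a fast gyro PID loop does the actual steering.

import time

def read_gyro_heading():
    # stand-in for the real IMU accessor (degrees)
    return 0.0

def latest_vision_offset():
    # stand-in: degrees from target center, refreshed at ~10 Hz, or None
    return None

def drivetrain_arcade(forward, turn):
    # stand-in for the real drive call
    pass

class PID:
    # tiny PI controller purely for illustration; gains are placeholders
    def __init__(self, kp=0.02, ki=0.0):
        self.kp, self.ki, self.integral = kp, ki, 0.0

    def update(self, error, dt):
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral

pid = PID()
heading_setpoint = read_gyro_heading()
last = time.time()

while True:
    offset = latest_vision_offset()
    if offset is not None:
        # re-anchor the setpoint only when a fresh vision result arrives;
        # ideally use the heading at image-capture time to absorb the latency
        heading_setpoint = read_gyro_heading() + offset

    now = time.time()
    turn = pid.update(heading_setpoint - read_gyro_heading(), now - last)
    last = now
    drivetrain_arcade(forward=0.3, turn=turn)
    time.sleep(0.02)   # ~50 Hz inner loop, independent of the vision rate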

dana_batali (dana.batali@gmail.com)
2017-10-16 09:46:11

@chrisrin: thanks for the pointer to the camera thread. Reading beyond the initial post you reference I learned that there are a few caveats with this particular approach:

  • they used the custom app to display the image on the driver station. ie: they didn't (couldn't?) integrate it with a typical dashboard. This approach may be viable if and only if the different windows can be laid out so as to not interfere with one another. Clearly it would be preferable to view the video in our dashboard context (and this may be possible since we have a browser-based dashboard).

  • also: h.264 requires non-trivial computation both to encode and, more importantly, to decode. This means that this format may not be easy to integrate into an efficient vision solution. In the referenced thread, there is some discussion of all this by the ILAMtitan poster.

chrisrin (chrisrin@microsoft.com)
2017-10-16 10:43:36

@dana_batali thanks for giving it a look. One question comes to mind: would it be appropriate to use the IP camera & h.264 for just the driver video feed and a different camera / solution for vision? As far as displaying the video, I'm going to recommend a second monitor on the refreshed drive station, so the laptop monitor for the dashboard & a 2nd one for the robot video stream (or vice versa). I think with the driver using the xbox controller & moving around to get sight lines, a larger dedicated image (ideally with low latency) is more likely to be used. It'll take some doing, but I believe it is possible.

dana_batali (dana.batali@gmail.com)
2017-10-16 10:51:52

*Thread Reply:* a definite possibility. Pretty much the same as we did last year (two cameras running through the roborio subnet). I wonder whether the two monitor solution might be overkill (and make for a very bulky driver station to carry around). As an aside, have you identified a student to help carry the torch you are holding? I have some small concerns that we're treading into mentor-led, rather than student-led, territory here.

dana_batali (dana.batali@gmail.com)
2017-10-16 11:00:22

*Thread Reply:* one other point: it's great that you are focused on the driver experience. The key risk associated with camera feeds is how well they will perform on the field. To ensure that all the lab dev work isn't wasted, it may be wise to look into simulating the network conditions associated with a real match: multiple robots contending for bandwidth via a shared router with QoS scheduling going on.

chrisrin (chrisrin@microsoft.com)
2017-10-16 13:00:56

*Thread Reply:* @dana_batali Yes, Jon C was just recently confirmed as drive coach, and we'll be transitioning things to him, though I'll still be in an advisory mentor role to help make sure there is awareness of options/opportunities. I've talked with Peter already about the prospect of designing a new drive station, including custom controls - haven't talked about the 2nd monitor thing - agree we may not need it & there are trade-offs.

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-16 11:43:13

@dana_batali Because I was spending time on testing and delivering, I hadn't thought about FPS or how that may affect output w/ the Rio. It seems that getting 10 values per second for the targets would be fine (the camera was operating at 10 FPS when I was testing). It would depend on how fast NetworkTables works (I have no idea how it works) for that to actually be a limitation. What FPS was the camera running at two years ago? @dana_batali

dana_batali (dana.batali@gmail.com)
2017-10-16 11:47:36

@Darwin Clark: as i mentioned above, we got 10-ish FPS through the python/opencv/jetson system. The usb camera on the jetson was able to deliver nearly 30 fps (at vga res) with no processing, iirc. 10 fps for processed results can definitely be made to work, just not inside the control loop... I can perhaps better motivate this point in person.

mrosen (michael.rosen@gmail.com)
2017-10-16 13:35:54

Hey gang, Declan sent out an email on the 14th asking everyone to get a development environment set up. Consider this a gentle reminder. If last year was any guide, it is distinctly non-trivial ... and many of us ended up doing it several times -- so beyond just following the instructions, try to acquire at least a nodding familiarity with what all we're installing and why.

Those of you running Linux might get lucky and have it 'just work.' If you're not so lucky, I found (on Debian Stretch) that installing "libwebkitgtk-1.0.0" fixes an unsatisfied dependency and allows the plugin wizards (File/New Project/Robot Example Project) to run without crashing.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-16 13:54:02

@channel Please see @mrosen's previous message.

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-16 13:54:46

@declan_freemangleason Just based on release dates it looks like I would be using 3.2.0.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-16 13:58:37

@Darwin Clark Ok... Are there any Jetson specific assumptions in the code?

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-16 14:09:33

@declan_freemangleason Not that I remember.

dana_batali (dana.batali@gmail.com)
2017-10-17 11:17:20

*Thread Reply:* here is the primary script we used to perform vision - it allowed us to experiment with a number of different approaches during dev. Once we settled on one of the algorithms, we created a bootstrapping mechanism that would automatically run this script in the preferred mode.

https://github.com/Spartronics4915/2016-Stronghold/blob/master/src/org/usfirst/frc/team4915/stronghold/vision/jetson/imgExplore2/imgExplore.py

GitHub
Darwin Clark (darwin.s.clark@gmail.com)
2017-10-17 21:44:58

*Thread Reply:* Fantastico, I'll probably end up looking at this around Thursday if I get my IDE set up properly. Still jumping through hoops with Ubuntu.

Mark Tarlton (mtarlton@acm.org)
2017-10-17 19:08:28

@Mark Tarlton has joined the channel

binnur (binnur.alkazily@gmail.com)
2017-10-22 15:23:56

@declan_freemangleason all ok on autonomous. However, any tricks for reverse camera? It is not showing anything.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 15:25:37

That's the network camera... Try going to 10.49.15.13/video.cgi

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 15:26:27

The username should be admin and the password should be blank.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 15:27:32

If that doesn't work then it's an issue with the actual camera, if it does, then it's probably an issue with the dashboard (I would guess that the login process is being weird?).
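
For what it's worth, the same check can be scripted from a laptop on the robot network. A hedged Python sketch (the IP, path and credentials are just the ones mentioned above and may differ on the actual camera):

import requests
from requests.auth import HTTPBasicAuth

url = "http://10.49.15.13/video.cgi"
try:
    # stream=True so we don't try to read the endless MJPEG body
    r = requests.get(url, auth=HTTPBasicAuth("admin", ""), stream=True, timeout=3)
    print("camera responded:", r.status_code, r.headers.get("Content-Type"))
    r.close()
except requests.RequestException as exc:
    print("no response from camera:", exc)

A 200 response with an mjpeg content type would point at the dashboard; no response at all would point at cabling or addressing.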

riyadth (riyadth@gmail.com)
2017-10-22 15:28:56

I cannot access the camera URL you provide (but can access the roborio at 10.49.15.2). The IP address of the camera does not respond to pings. (But the green light on the camera is illuminated)

riyadth (riyadth@gmail.com)
2017-10-22 15:29:06

I will check the Ethernet cable next...

riyadth (riyadth@gmail.com)
2017-10-22 15:34:30

Checked the Ethernet cable. It is secure at both ends, but still no communication with the camera.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 15:36:44

Hmm...

riyadth (riyadth@gmail.com)
2017-10-22 15:37:45

I'm thinking of trying a new Ethernet cable. The camera is using one of those thin, flat ones... I don't know if I trust those...

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 15:37:52

You could try plugging it into your computer.

riyadth (riyadth@gmail.com)
2017-10-22 15:38:07

True. I will do that when they stop driving.

riyadth (riyadth@gmail.com)
2017-10-22 15:38:21

The IP address is static on the camera, right? No DHCP?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 15:41:42

I would think that it would be set to static, otherwise it would have an issue with the radio on the robot (that has DHCP disabled).

riyadth (riyadth@gmail.com)
2017-10-22 16:09:01

We unplugged the camera at the radio and I plugged in to my computer. It worked fine. We plugged it back in to the robot radio, and it worked fine on the driver station. So it was probably a loose connection on the Ethernet at the radio. It's all good now. Thanks @declan_freemangleason for your support.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-22 16:18:21

Sounds good

coachchee (echee@bisd303.org)
2017-10-22 16:59:54

Thanks guys !

binnur (binnur.alkazily@gmail.com)
2017-10-22 17:22:21

@declan_freemangleason thank you for remote support! :)

rose_bandrowski (rose.bandrowski@gmail.com)
2017-10-22 20:34:20

@binnur since we fixed the camera, do I need to ask Declan to come in on Tuesday after school still?

binnur (binnur.alkazily@gmail.com)
2017-10-22 20:35:26

probably not — though, we didn't really fix it… we just replugged it a couple of times 😕

rose_bandrowski (rose.bandrowski@gmail.com)
2017-10-22 20:36:01

Sorry, terminology 😅

binnur (binnur.alkazily@gmail.com)
2017-10-22 20:37:48

no worries — it is the default ctrl-alt-delete action of windows 🙂

👍:skin-tone-3: rose_bandrowski
declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-23 14:25:27

@EmmaLahtinen @CoryHouser @charlie @JoshGoguen @WillieBarcott @austinsmith @Ulysses Glanzrock @RyanOlney @Darwin Clark @justicejames @adamrideoutredeker @voglmadeline Just a friendly reminder to try setting up your computer using the instructions in the email I sent to you. If you can't get something to work, we'll be there to help on Wednesday, but we would really appreciate if you try all the steps on your own. Also remember to set your Slack profile picture to a good photo of yourself!

Emma_Lahtinen (lahtiemm000@frogrock.org)
2017-10-23 14:25:32

@Emma_Lahtinen has joined the channel

Ryan_Olney (olneyrya000@frogrock.org)
2017-10-23 14:26:39

I don't have a portable computer that I can bring back and forth, so what should I do about that?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-23 14:28:21

@Ryan_Olney It sounds like you should borrow a team computer, which means that you don't need to do anything right now but set up Slack (which you've already done).

Ryan_Olney (olneyrya000@frogrock.org)
2017-10-23 14:29:02

Ok thx

coachchee (echee@bisd303.org)
2017-10-23 18:13:58

Ryan, still set up at home so you can program at home.

coachchee (echee@bisd303.org)
2017-10-23 18:15:56

You can borrow a laptop during the meeting but not take it home.

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-23 18:30:23

@coachchee @Ryan_Olney That's a good point, thank you Mr. Chee.

Ryan_Olney (olneyrya000@frogrock.org)
2017-10-23 19:19:43

Ok will do

Ulysses Glanzrock (glanzuly000@frogrock.org)
2017-10-23 20:54:23

My laptop has been acting up and going through various updates and is just not working right now, so can I use a team computer for this next meeting?

ronan_bennett (benneron000@frogrock.org)
2017-10-23 21:05:38

@Ulysses Glanzrock Yes you can use a team computer during the meeting, but you should also bring your own laptop in case we are able to help with the setup

Josh_Goguen (goguejos000@frogrock.org)
2017-10-25 15:27:43

The meeting is at 6:15 today, right?

Josh_Goguen (goguejos000@frogrock.org)
2017-10-25 15:28:11

Wait I got it, never mind

Cory_Houser (fishy.hero@gmail.com)
2017-10-25 16:29:47

My Mac is having some problems downloading the Eclipse software - does anyone have any tips for me, or should I just check it at the meeting today?

ronan_bennett (benneron000@frogrock.org)
2017-10-25 16:46:41

@Cory_Houser Unless someone else has Mac tips, we'll just sort it out at the meeting

Johan coondog (coonajon000@frogrock.org)
2017-10-25 18:57:19

@Johan coondog has joined the channel

riyadth (riyadth@gmail.com)
2017-10-25 19:40:45

Notes on how to shut down Mr. Chee's Samsung laptop if it won't shut off: https://superuser.com/questions/290132/how-to-force-power-off-of-a-samsung-series-9-laptop

superuser.com
riyadth (riyadth@gmail.com)
2017-10-25 19:40:56

Short story, stick a paperclip in the hole in the middle of the back.

riyadth (riyadth@gmail.com)
2017-10-25 19:42:02

Then you have to plug it in before it will boot again.

Terry (terry@t-shields.com)
2017-10-25 19:44:22

What? No voodoo spells required?

Anna_Banyas (banyaann000@frogrock.org)
2017-10-25 22:34:37

@Anna_Banyas has joined the channel

Martin_Vroom (vroommar000@frogrock.org)
2017-10-26 10:45:28

@Martin_Vroom has joined the channel

binnur (binnur.alkazily@gmail.com)
2017-10-28 07:48:11

@declan_freemangleason do you have drivers for the usb Ethernet?

dana_batali (dana.batali@gmail.com)
2017-10-28 11:28:01

very interesting news for FRC 2018 wrt programming:

  1. new game data api (opens possibilities for better automated scouting)

  2. python is now acknowledged as an FRC language

  3. a new BLDC (brushless) motor controller and supporting s/w

details here: https://www.firstinspires.org/robotics/frc/blog/2018-beta-teams-brushless-game-specific-data

FIRST
mrosen (michael.rosen@gmail.com)
2017-10-29 16:41:28

I want to share a conversation I had with team 4450 who implemented computer vision: in autonomous mode, their robot looks for the two shiny tape blocks on either side of the peg (for the gear) and uses that to drive the robot so the gear lands on the peg.

It works like this. They install Grip, the WPILib tool that FRC provides for simple CV integration, on a laptop -- totally unconnected to any robot anything. They point the laptop's webcam at the targets (shiny tape) and fuss with the dials until it recognizes the shapes. Then Grip dumps out a Java class -- a .java file -- which derives from wpilib.vision.pipeline. They put this class into their robot code. It has two APIs we care about: process() and getTheArrayOfTargetShapes(). They plug a web cam into the Roborio's USB port. While the robot is driving they have a loop that looks like this:

while True:
   img = webcam.readCurrentImage()
   pipeline.process(img)
   Array<Images> a = pipeline.getTargetImages()
   // make sure there are two images, find the center of each
   //  mark the midpoint between the two images.  That's where the peg is.
   //   calculate the offset between the peg and the center of the image
   //   insert that offset into the drivetrain so we steer toward that.

I obviously am playing fast and loose with the details but he walked me through firing up Grip and using the generated code in the Robot. I was impressed with how straightforward the whole thing seemed. He said it worked pretty well.
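
For reference, the pipeline steps described above (threshold on the bright tape color, find the two blobs, take the midpoint) look roughly like this in OpenCV. This is a hedged Python sketch, not team 4450's code (theirs is in the repo linked below); the HSV bounds are placeholders that would need tuning under the real ring light.

import cv2
import numpy as np

def find_peg_offset(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # keep only bright retroreflective-green pixels (placeholder bounds)
    mask = cv2.inRange(hsv, np.array([60, 100, 100]), np.array([90, 255, 255]))
    # [-2] works across the OpenCV 3.x/4.x findContours return signatures
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    # take the two largest blobs as the two tape strips
    strips = sorted(contours, key=cv2.contourArea, reverse=True)[:2]
    if len(strips) < 2:
        return None
    centers = []
    for c in strips:
        x, y, w, h = cv2.boundingRect(c)
        centers.append(x + w / 2.0)
    peg_x = sum(centers) / 2.0                 # midpoint between the strips = the peg
    return peg_x - bgr_image.shape[1] / 2.0    # pixel offset from the image center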

dana_batali (dana.batali@gmail.com)
2017-10-30 12:11:01

*Thread Reply:* two questions come to mind:

  1. did it actually work?
  2. did they talk about how many vision targets per second they achieved? If that number isn't high, then they would have to move very slowly to prevent oscillations.

next point: a peg-offset coupled with a distance is generally insufficient to deliver successfully. It presumes that the approach angle is approximately perpendicular to the peg (which can be "guaranteed" either during autonomous or by the driver).

finally: if i understand it, GRIP is just a way to avoid coding and use a simpler (graphical) interface to describe an imaging pipeline, right?
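
To make the perpendicular-approach caveat concrete: a center offset plus a range gives the peg's position in the robot frame but says nothing about the peg's orientation, so the robot can arrive skewed. Recovering the full pose needs the target's known geometry, for example with cv2.solvePnP. A hedged sketch, where the corner layout is an approximation of the 2017 gear-target dimensions (inches) and the camera matrix / distortion coefficients would come from a separate calibration:

import cv2
import numpy as np

# 3D corners of the two tape strips in the target's own frame (x right, y up,
# z out of the target plane); approximate 2017 gear-target dimensions, inches
OBJECT_POINTS = np.array([
    [-5.125,  2.5, 0], [-3.125,  2.5, 0], [-3.125, -2.5, 0], [-5.125, -2.5, 0],
    [ 3.125,  2.5, 0], [ 5.125,  2.5, 0], [ 5.125, -2.5, 0], [ 3.125, -2.5, 0],
], dtype=np.float64)

def target_pose(image_points, camera_matrix, dist_coeffs):
    # image_points: the same 8 corners found in the image, in the same order
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    # tvec gives range and lateral offset; rvec encodes the target plane's
    # orientation, i.e. how far off-perpendicular the current approach is
    return rvec, tvec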

riyadth (riyadth@gmail.com)
2017-10-29 18:52:35

FYI, that's Olympia Robotics Federation (http://orf4450.org/), and the code for their robot is on GitHub here: https://github.com/ORF-4450/Robot10

GitHub
declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-29 20:00:56

After playing around with Grip and Java/Python from scratch, Grip feels like a hassle to me. You have to know what processing steps you want, whether or not you're using Grip, and there isn't much actual complexity in the vision code that you would write by hand. Any of the complexity is in the logic that comes after the processing, but Grip didn't really do a lot in that department. To me it seems like the challenge is just tuning and robot integration, which you really need a robot with a correctly positioned camera and consistent lighting for anyway. I do think that running the vision code on the RoboRio, written in Java, is very appealing. I'm not a big fan of weakly typed or interpreted languages, especially when testing comes at a premium. An actual objective advantage of this is the elimination of a coprocessor/separated codebase. @mrosen Did you ask them about how the RoboRio handled the workload?

mrosen (michael.rosen@gmail.com)
2017-10-29 20:08:03

I specifically asked about the impact of the image processing on the RoboRio: "we have concerns that its very processor-intensive and the processor may already be pretty busy"... "might be... but we didn't notice anything."

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-29 20:10:56

That's interesting... They appear to have integrated it into a control loop, but I never really noticed how well their vision worked though. Control code here: https://github.com/ORF-4450/Robot10/blob/master/src/Team4450/Robot10/Autonomous.java

GitHub
riyadth (riyadth@gmail.com)
2017-10-29 22:06:02

Do we have any scouting data we can refer to, regarding their autonomous gear delivery?

declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-29 22:40:32

I can't seem to find any by searching Slack... Am I missing something, or is the data actually missing?

riyadth (riyadth@gmail.com)
2017-10-30 11:12:48

You may need to follow up with Kenneth. Maybe it is not exported to somewhere searchable yet.

randy_groves (randomgrace@gmail.com)
2017-10-30 11:20:44

boot

randy_groves (randomgrace@gmail.com)
2017-10-30 11:21:27

AAAH! Too many keyboards!

dana_batali (dana.batali@gmail.com)
2017-10-30 11:39:30

raspberry pi3 for vision thread: https://www.pyimagesearch.com/2016/04/18/install-guide-raspberry-pi-3-raspbian-jessie-opencv-3/

PyImageSearch
declan_freemangleason (declanfreemangleason@gmail.com)
2017-10-30 17:34:11

*Thread Reply:* @dana_batali What experience do we have using a Raspberry Pi? Do you think that there's any reason to even consider the Jetson?

dana_batali (dana.batali@gmail.com)
2017-10-31 09:18:07

*Thread Reply:* i would guesstimate that a jetson tk1 still offers several times more raw performance, so a team that knows how to leverage that power would have an advantage. That said, I attended some talks at worlds and discussed the topic of neural-net-based vision with a very on-the-ball student, and he said it may not be worth the trouble for the simple problems usually presented in FRC. So I would venture to guess that the pi3 is the better solution because:

  1. it runs on 5V - and thus won't require extra buck-boost equipment. We should be able to power it directly off the PDP (it requires a max of 2A, so we'd need to verify that).

  2. there is a high-speed camera bus with widely available camera solutions that will allow us to bypass USB issues and increase the framerate. Raspberry pi has a bigger dev community than jetson (albeit more enthusiasts and fewer experts).

  3. as for experience: I have some, and i would guess that other mentors do too. I have a raspberry pi 2 that I'd be happy to donate, but since raspberry pi 3s are only $40-$45, I would guess it's better just to fork over the dough for those. I would be happy to donate a couple of those to the team, if there is consensus that we should go that way.

Perhaps the next step is for student leaders to make a determination on this question... An important consideration: are there sufficient programming resources to dedicate to this task? (i believe the answer is that we'd only need @Darwin Clark with oversight from @declan_freemangleason, so resources may not be an issue).

Darwin Clark (darwin.s.clark@gmail.com)
2017-10-31 09:38:25

*Thread Reply:* @dana_batali Resources (specifically manpower) aren't an issue. I think I'm basically going to be focusing on vision the entire year (so long as the game allows it). The process right now is discussing the platform, which leads us back to your statement: student leadership needs to make a decision.

dana_batali (dana.batali@gmail.com)
2017-10-31 13:34:27

*Thread Reply:* re chief delphi thread: https://www.chiefdelphi.com/forums/showthread.php?threadid=159883

this unit seems comparable to raspberry pi 3 - it may actually be a bit cheaper ($50 includes camera)...

Hardware specs for the jevois: https://www.jevoisinc.com/pages/hardware

https://cdn.shopify.com/s/files/1/1719/3183/files/comparison-to-rpi_1024x1024.png?v=1488568740

Seems like it has less RAM but a faster clock speed.

From an opencv point of view, the unit is quite comparable to a pi3; it just appears to have lots more canned solutions. This will advantage teams who want plug-n-play vision. My predilection is that students learn the nuts-and-bolts of things, rather than learn the skill of good shopping for off-the-shelf solutions. For this reason, I think I'd lean toward the pi3, but there's no doubt that the jevois seems comparable and viable.

Other differences:

pi3 has wired, wifi, and bluetooth networking, but jevois has only usb. It's not yet clear to me how jevois would signal the "host". It appears to have a custom serial terminal interface, so we'd need to write a custom serial client app to poll for results and communicate them to the robot (see the sketch below for what that might look like), or perhaps we'd simply integrate the "terminal" into the robot code. Contrast this with network-table-based communication using pynetworktables.

chiefdelphi.com
JeVois Smart Machine Vision
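
A sketch of what that serial bridge might look like, hedged: poll a jevois-style serial stream with pyserial and republish results to NetworkTables for the robot code to consume. The device path, baud rate, message format and table/key names are all assumptions for illustration; the real jevois output format would need to be checked against its docs.

import serial
from networktables import NetworkTables

NetworkTables.initialize(server="10.49.15.2")   # roborio address
table = NetworkTables.getTable("Vision")

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)
while True:
    line = ser.readline().decode("ascii", errors="ignore").strip()
    if not line:
        continue
    # assume a simple "T <offset_degrees>" message per detected target
    parts = line.split()
    if len(parts) == 2 and parts[0] == "T":
        table.putNumber("targetOffsetDegrees", float(parts[1]))
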
dana_batali (dana.batali@gmail.com)
2017-10-31 13:40:32

*Thread Reply:* one other consideration for choice of vision platform - if we want nvidia sponsorship, @joncoonan indicates that we need to make a stronger commitment to delivering solutions atop their platform.

So to summarize we have 5+ vision platform options:

  1. roborio
  2. jetson tk1 or tx1
  3. raspberry pi 3
  4. cmucam (pixycam)
  5. jevois
dana_batali (dana.batali@gmail.com)
2017-11-05 11:27:45

*Thread Reply:* @declan_freemangleason, @Darwin Clark here is some "required reading" for video stream processing on raspberry pi:

https://picamera.readthedocs.io/en/release-1.13/fov.html
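
As a starting point, a minimal picamera-to-OpenCV capture loop following the pattern in the docs linked above (resolution and framerate are placeholders to be tuned against those field-of-view notes):

from picamera import PiCamera
from picamera.array import PiRGBArray

camera = PiCamera()
camera.resolution = (320, 240)
camera.framerate = 30
raw = PiRGBArray(camera, size=camera.resolution)

# use_video_port=True is much faster, but can crop the field of view in
# some sensor modes (exactly the trade-off the fov doc above describes)
for frame in camera.capture_continuous(raw, format="bgr", use_video_port=True):
    img = frame.array        # numpy BGR array, ready for an OpenCV pipeline
    # ... run the vision processing on img here ...
    raw.truncate(0)          # reset the stream buffer before the next frame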

dana_batali (dana.batali@gmail.com)
2017-11-21 19:07:43

*Thread Reply:* @declan_freemangleason @Darwin Clark @riyadth - to get to the bottom of the performance difference between the jetson and raspi, here's a simple c file that computes a standard floating-point benchmark... It would be a great exercise to get the numbers on the raspi 2, 3 and jetson... I can contribute 2 and 3; @Darwin Clark: can you do the jetson? If you have time in the next several days (not Thursday), let me know if you have any questions:

http://www.netlib.org/benchmark/linpackc.new

dana_batali (dana.batali@gmail.com)
2017-11-21 19:12:37

*Thread Reply:* as an example: here's the output for the default array size (200) on an old imac (i expect the numbers to be slower/lower on raspberry pi and jetson):

LINPACK benchmark, Single precision.
Machine precision:  6 digits.
Array size 200 X 200.
Average rolled and unrolled performance:

    Reps Time(s) DGEFA   DGESL  OVERHEAD    KFLOPS
----------------------------------------------------
     128   0.57  83.38%   2.40%  14.22%  360647.344
     256   1.14  83.46%   2.39%  14.15%  358241.625
     512   2.27  83.41%   2.40%  14.20%  361488.062
    1024   4.54  83.45%   2.39%  14.16%  361188.000
    2048   9.04  83.42%   2.40%  14.18%  362719.188
    4096  18.09  83.41%   2.41%  14.18%  362354.906
dana_batali (dana.batali@gmail.com)
2017-11-21 19:13:05

*Thread Reply:* KFLOPS is what we're after

dana_batali (dana.batali@gmail.com)
2017-11-21 19:14:16

*Thread Reply:* and here is the trivial compilation:

gcc linkapackc.new -o linkapack

dana_batali (dana.batali@gmail.com)
2017-11-21 19:17:53

*Thread Reply:* two other points:

  1. we want to edit this file to focus on single-precision performance. This means we want line 21 of the c file to read

#define SP

  2. optimization matters. Here's a build line that produces better results on mac:

gcc -O3 linkapackc.new -o linkapack

    Reps Time(s) DGEFA   DGESL  OVERHEAD    KFLOPS
----------------------------------------------------
     512   0.55  84.73%   2.83%  12.44%  1461630.750
    1024   1.10  84.66%   2.83%  12.51%  1462711.125
    2048   2.21  84.70%   2.82%  12.48%  1454971.250
    4096   4.40  84.72%   2.82%  12.46%  1462078.750
    8192   8.84  84.72%   2.82%  12.46%  1454345.500
   16384  17.69  84.71%   2.83%  12.46%  1453299.750
Darwin Clark (darwin.s.clark@gmail.com)
2017-11-21 21:02:58

*Thread Reply:* Alright Dana, you got it. I'll have the numbers by Monday.

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 12:54:35

*Thread Reply:* @dana_batali I spent roughly an hour on this, trying to get past compilation errors. Whenever I compiled the file (I tried three different ways) it would spit out a ton of errors looking like this:

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 12:54:51

*Thread Reply:*
linepack.c:1:6: error: expected '=', ',', ';', 'asm' or '__attribute__' before '.' token
  NPACK.C  Linpack benchmark, calculates FLOPS.
In file included from /usr/include/stdio.h:74:0, from linepack.c:23:
/usr/include/libio.h:306:3: error: unknown type name 'size_t'
  size_t __pad5;
/usr/include/libio.h:310:67: error: 'size_t' undeclared here (not in a function)
  char _unused2[15 * sizeof (int) - 4 * sizeof (void *) - sizeof (size_t)];
/usr/include/libio.h:338:62: error: expected declaration specifiers or '...' before 'size_t'
  typedef __ssize_t __io_read_fn (void *__cookie, char *__buf, size_t __nbytes);
/usr/include/libio.h:347:6: error: expected declaration specifiers or '...' before 'size_t'
  size_t __n);
/usr/include/libio.h:469:19: error: expected '=', ',', ';', 'asm' or '__attribute__' before '_IO_sgetn'
  extern _IO_size_t _IO_sgetn (_IO_FILE *, void *, _IO_size_t);
In file included from linepack.c:23:0:
/usr/include/stdio.h:319:35: error: expected declaration specifiers or '...' before 'size_t'
  extern FILE *fmemopen (void *__s, size_t __len, const char *__modes)
/usr/include/stdio.h:325:47: error: expected declaration specifiers or '...' before 'size_t'
  extern FILE *open_memstream (char **__bufloc, size_t *__sizeloc) __THROW __wur;
/usr/include/stdio.h:337:20: error: expected declaration specifiers or '...' before 'size_t'
  int __modes, size_t __n) __THROW;
/usr/include/stdio.h:344:10: error: expected declaration specifiers or '...' before 'size_t'
  size_t __size) __THROW;
/usr/include/stdio.h:386:44: error: expected declaration specifiers or '...' before 'size_t'
  extern int snprintf (char *__restrict __s, size_t __maxlen,
/usr/include/stdio.h:390:45: error: expected declaration specifiers or '...' before 'size_t'
  extern int vsnprintf (char *__restrict __s, size_t __maxlen,

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 12:55:28

*Thread Reply:* That is a shortened version, because the entire error message would be way too long. The three methods of compiling I used were:

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 12:56:11

*Thread Reply:* gcc linkapackc.new -o linkapack, cc -O -o linpack linpack.c -lm, and simply gcc linpack.c

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 12:57:14

*Thread Reply:* I'm a little thrown off by the file types, because on the link you gave (http://www.netlib.org/benchmark/linpackc.new) the linpack source is saved as a C file, and in the command YOU gave, linpack is saved as a .new file.

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 12:57:19

*Thread Reply:* Got any suggestions?

dana_batali (dana.batali@gmail.com)
2017-11-26 13:25:38

*Thread Reply:* mysterious - can you rename the file to end in .c?

mv linpackc.new linpack.c

dana_batali (dana.batali@gmail.com)
2017-11-26 13:51:33

*Thread Reply:* just got some numbers on my jetson tk1... I'll post them presently. Perhaps you can compare my steps:

  1. i used wget to obtain the file via:

wget http://www.netlib.org/benchmark/linpackc.new

  2. rename the file: mv linpackc.new linpacknew.c

  3. edit linpacknew.c, change #define DP to #define SP on line 31.

  4. compile the file: gcc -O3 linpacknew.c -o linpacknew

  5. run the file: ./linpacknew

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 14:09:19

*Thread Reply:* Ahh I see, I added #define SP, but other than that I followed the same steps

Darwin Clark (darwin.s.clark@gmail.com)
2017-11-26 14:09:27

*Thread Reply:* I'll try those steps once I get back home

dana_batali (dana.batali@gmail.com)
2017-11-26 14:11:10

*Thread Reply:* here are numbers i got on my jetson:

Reps Time(s) DGEFA   DGESL  OVERHEAD    KFLOPS
----------------------------------------------------
     512   0.75  86.68%   3.00%  10.32%  1042013.250
    1024   1.50  86.66%   3.00%  10.34%  1042433.500
    2048   3.01  86.66%   3.00%  10.33%  1041903.188
    4096   6.02  86.66%   3.00%  10.33%  1041714.125
    8192  12.14  86.50%   3.14%  10.36%  1033971.625
dana_batali (dana.batali@gmail.com)
2017-11-26 14:22:12

*Thread Reply:* here are numbers from a raspi3:

LINPACK benchmark, Single precision.
Machine precision:  6 digits.
Array size 200 X 200.
Average rolled and unrolled performance:

    Reps Time(s) DGEFA   DGESL  OVERHEAD    KFLOPS
----------------------------------------------------
     128   0.99  89.64%   2.84%   7.52%  191614.531
     256   1.98  89.65%   2.83%   7.52%  191611.078
     512   3.97  89.64%   2.84%   7.52%  191589.609
    1024   7.94  89.64%   2.84%   7.52%  191588.969
    2048  15.87  89.64%   2.84%   7.52%  191589.547

raspi 2 numbers were in the 151000 range.
dana_batali (dana.batali@gmail.com)
2017-11-26 14:25:20

*Thread Reply:* so it appears that the jetson has significantly better floating point performance as @riyadth guessed:

1042013.250 / 191614.531 ≈ 5.4x faster

dana_batali (dana.batali@gmail.com)
2017-11-26 14:35:49

*Thread Reply:* for completeness, i obtained numbers from beaglebone black:

LINPACK benchmark, Single precision.
Machine precision:  6 digits.
Array size 200 X 200.
Average rolled and unrolled performance:

    Reps Time(s) DGEFA   DGESL  OVERHEAD    KFLOPS
----------------------------------------------------
      16   0.55  92.57%   2.66%   4.77%  41841.398
      32   1.11  92.54%   2.65%   4.81%  41730.660
      64   2.21  92.51%   2.66%   4.83%  41864.852
     128   4.38  92.53%   2.64%   4.83%  42193.164
     256   8.87  92.50%   2.67%   4.83%  41655.719
     512  17.40  92.53%   2.63%   4.83%  42462.664
dana_batali (dana.batali@gmail.com)
2017-11-26 16:17:43

*Thread Reply:* here's a work-in-progress spreadsheet looking at the different vision platforms... Happy to make it editable by @Darwin Clark or @declan_freemangleason if you provide me your gmail addresses.

https://docs.google.com/spreadsheets/d/1o-_Ma2ZDfu3egSJ9etxWQ8HrdiuvEqC2BwkLKiq_aqA/edit?usp=sharing

dana_batali (dana.batali@gmail.com)
2017-11-26 16:17:47