TPA81 + OOPic

Hello everybody, I'm a teacher at an Italian technical school.

As a school project we're building two multi-functional robots (some info at

formatting link
I need to use a TPA81 temperature sensor via I2C with an OOPic (mounted on an OOPic-R robot controller board). Has anyone already written code to use this sensor via I2C?

Thank you! Carlo Valentini

Reply to
Illo

I found a small doc via google:

formatting link
That should be enough for trial-and-error experimentation. If you need to implement the whole I2C protocol, there are a number of places where you can see it implemented, for instance the Linux kernel; I've also seen source code to DOS drivers on the Velleman K8000 floppy which implement a parallel-port version.

Reply to
mlw

That would be the TPA81 documentation page.

The OOPIC already implements I2C completely (it's a major feature of the device), and supports the creation of I2C "objects" with which a program can interact without knowing the details of the underlying protocol. I2C is the base communication protocol OOPICs use to communicate with other OOPICs and any peripherals that support I2C.

All the original poster should have to do to read pixels back is to write the desired register number to the sensor I2C address, and then read back the pixel value.

Changing the servo position involves writing out the command register address followed by the servo position command.
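
For reference, the same write-register-then-read sequence looks like this on a generic Linux I2C master. This is a sketch only, not OOPic code; it assumes the Linux i2c-dev interface on adapter /dev/i2c-1, the TPA81's default address of 0xD0 (0x68 as a 7-bit address), and that register 2 holds the first of the eight pixel temperatures:

    /* Minimal sketch: read one TPA81 register over Linux i2c-dev.
       Assumptions: adapter /dev/i2c-1, default TPA81 address 0xD0 (7-bit 0x68),
       register 2 as the first of the eight pixel temperatures. */
    #include <fcntl.h>
    #include <linux/i2c-dev.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/i2c-1", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        if (ioctl(fd, I2C_SLAVE, 0x68) < 0) {   /* select the sensor */
            perror("ioctl"); return 1;
        }

        unsigned char reg = 2;                  /* desired register number */
        unsigned char pixel;
        if (write(fd, &reg, 1) != 1 ||          /* write the register number...   */
            read(fd, &pixel, 1) != 1) {         /* ...then read the value back    */
            perror("i2c transfer"); return 1;
        }
        printf("pixel register %u = %u\n", reg, pixel);

        /* Moving the servo would be a two-byte write instead: the command
           register number followed by the desired position value. */
        close(fd);
        return 0;
    }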

The Devantech site doesn't have specific examples for using the OOPIC with the TPA81, but there is a small OOPIC example for use with the SRF08 sonar sensor, which implements a similar read/write register interface.

You might actually be interested in the SRF08 for your robot -- the timing of the sonar pulses is handled entirely on the sensor itself, which will keep you from having to deal with sub-millisecond level timing issues at the main board level (assuming you're going to support sonar), or building discrete logic to deal with the timing. It's kind of expensive, though.

Reply to
the Artist Formerly Known as K

BTW, there is an OOPic support forum on Yahoo ...

formatting link

Reply to
dan

One thing to remember about the OOPic-R is that the connector labeled "I2C" is not the one to use. That is the network connector. The one to use is the one behind the D9.

Gerry.

Reply to
Werewolf

Illo wrote: Hello to you too.

The ooPIC can talk to this pretty easily. Do this:

  Use the bit address of 0xD0 shifted right one bit for I2C.Node
  Use I2C.Width = cv8bit
  Use I2C.Mode = cv10bit
  Use I2C.NoInc = cvTrue

(When I use "I2C.", I2C is the object; you'll no doubt have some other object name for your device.)

Now you can access any register by setting I2C.Location to the register number, and then either read the value back or write a value out.

have fun, DLC

Reply to
Dennis Clark

I have been thinking about sonar. It isn't a difficult problem. On Pentium-class machines there is a hardware counter that takes no CPU time. One need only read the counter at the start and at the end of a cycle. If we read the counter, initiate a pulse, and read the counter at interrupt time, we should be able to get a fairly accurate reading.
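
A sketch of that read-the-counter idea, polled rather than interrupt-driven for brevity. It uses clock_gettime(CLOCK_MONOTONIC) instead of reading the Pentium counter directly, and trigger_pulse() / echo_detected() are placeholders for whatever port I/O actually drives the transducer:

    /* Sketch only: time an echo by reading a high-resolution clock before
       firing the pulse and again when the echo is detected. */
    #include <stdio.h>
    #include <time.h>

    static double now_us(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec * 1e6 + ts.tv_nsec / 1e3;
    }

    /* Placeholders -- replace with real port I/O. */
    static void trigger_pulse(void) { }
    static int  echo_detected(void) { return 1; }

    int main(void)
    {
        double start = now_us();
        trigger_pulse();                    /* start the ping */
        while (!echo_detected())            /* in the real thing this would be an
                                               interrupt, not a busy-wait */
            ;
        double elapsed = now_us() - start;  /* round-trip time in microseconds */
        printf("round trip: %.1f us\n", elapsed);
        return 0;
    }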

Sound travels about one inch per millisecond (884us), or 1131 inches per second at 70F. Typical bad interrupt latency on Linux is estimated at about 10us; even if it is 10 times that number, 100us, that is 1/8 inch. That isn't even including that it is a round trip.

The expected interrupt latency is less than a wavelength of the sonar pulse.

Reply to
mlw

BTW: sound typically travels 1 inch/74 us through air, not 1 inch/884 us. That's around 1200 FEET (not inches) per second.

The maximum interrupt latency under Linux can occasionally go far higher than what you have stated -- it's going to depend on what else is happening on the machine, and on whether anything has decided to disable interrupts. Linux isn't hard real-time, and thus there is no (theoretical) upper limit on response time.

BUT (and don't take this as me advocating your approach) I think you're probably OK -- in general, latency is pretty good, especially in 2.6, and the occasional missed sonar pulse shouldn't be an issue. You should at least be able to tell when you've missed an incoming pulse and simply use the last value, and most of the time I'll bet you don't miss too many. (No matter what approach you take, sometimes an echo just doesn't come back for a variety of reasons, so all implementations have to handle missed pulses.) Also, you are timing the round trip, so you actually have a full 148us per inch.

Bear in mind that you'll want the capability to run multiple sonar units, not just one. You are probably still going to want external discrete logic (and a means to communicate with that logic) to multiplex units if you still want I/O lines left for other purposes. This implies that you probably don't/can't fire multiple units at one time (which is normally fine.)

Not really (see note on speed of sound) -- but it's probably adequate for your purposes.

Reply to
The Artist Formerly Known as K

You mean I can't break the sound barrier by going 55 miles per hour? Isn't that why the "55" is a different color on speedometers of older cars?...

Just an added comment: Unknown real latencies mean that when using a simple home-made transducer setup, you could receive a secondary echo and think it's the first. Lends something to the saying "objects in sonar are closer than they appear." That's why the SRF08, and the better ultrasonic sensors like the Polaroid (now SensComp) module, are so handy. They deal with multiple echoes and varying gains internally.

Going back to hacking an old Polaroid camera, this is really still a good way to go. They're cheap -- I routinely find them for $5 or so at thrift stores and garage sales -- and they have been successfully interfaced to various PC ports. If you miss an echo, that's okay, because you can retrigger. But at least you won't get false echoes. I've done this under DOS (but not Windows).

*Good* sonar distance measurement is harder than it looks. You have to deal with phasing effects, mechanical and acoustic ringing, and jitter -- as well as any timing issues. Except for the SRF08 and Polaroid units, which do it internally, measuring the timing of an echo from a 40 kHz pulse works okay in the lab, but not always in the real world. I've found, for example, that it helps to also detect the gain of the return echo. You also get better distance by ramping the gain the longer it takes for the ping to return. If you can't directly control output gain, you can often "detune" the 40 kHz (or whatever freq. is used) slightly to one side or the other.

-- Gordon

Reply to
Gordon McComb

Only in Los Angeles during rush hour, where the terms "fast" and "slow" take on an entirely new meaning.

If he's just looking to do simple obstacle detection (and even mapping) he's almost certainly OK. I use the older Devantech units (the ones that just allow you to ping and time the echo line) and in practice, they work extremely well for obstacle detection, assuming the object is actually in view of the unit. There are indeed occasional errors, but these don't much affect the overall performance, even with REGIS, where folks will actively try to send the robot down the stairs or into the server rack.

I've also used these same units for mapping, with somewhat less success. Part of the problem is that the FOV of the regular units is pretty wide, but another issue comes in with the factors you mention -- funky secondary echoes. I've played with mitigating this a bit by supplementing a single ranger used for mapping with a second long-range Sharp IR sensor (don't make me recall the model number right now). This has yielded somewhat better results for mapping. I may, however, just suck it up and get an 08 for mapping, also using it in conjunction with the Sharp sensor. A Polaroid is also a possibility -- but I have so much other cr*p on my plate now I've barely had the time to think about it much.

Cheers -- m

Reply to
the Artist Formerly Known as K

Actually, it's more like 1130 feet/second -- unless you keep it REALLY hot...

Reply to
The Artist Formerly Known as K

Doh! I forgot to multiply by 12" per foot!! My bad. I was surprised at the numbers; I knew they looked really slow, but I just didn't see it. Thanks for the catch. Anyway, it still works out fairly well:

A one-inch round trip is (at 70F) about 147us. Still assuming a 100us interrupt latency (not application latency), that means we are worst case about 3/4 of an inch off; we can probably trim an amount off that by assuming a non-zero latency offset in our calculations.
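
In code, that conversion with the latency trimmed off as a fixed offset is roughly the following (the 147us-per-inch figure and the 100us offset are the assumptions from above):

    /* Sketch: round-trip echo time to distance at roughly 70F, trimming an
       assumed fixed latency offset. */
    #include <stdio.h>

    #define US_PER_INCH_ROUND_TRIP 147.0    /* ~74 us out plus ~74 us back */

    static double echo_us_to_inches(double echo_us, double assumed_latency_us)
    {
        double t = echo_us - assumed_latency_us;   /* trim the latency estimate */
        if (t < 0.0)
            t = 0.0;
        return t / US_PER_INCH_ROUND_TRIP;
    }

    int main(void)
    {
        /* A 3628 us echo with an assumed 100 us offset works out to 24 inches. */
        printf("%.1f inches\n", echo_us_to_inches(3628.0, 100.0));
        return 0;
    }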

Don't confuse process latency with "Interrupt Response Time" latency. ISRs are usually measured in microseconds. Process latency can be much larger.

Actually, not only does a pulse come back, but multiple pulses come back. The pulse will bounce off multiple objects, some closer, some further away. It makes for real problems when you have multiple sensors operating on the same frequency and pointed near each other.

Theoretically at least, you could fire more than 8 ultrasonic sensors with the PC parallel port and have minimal OR logic to trigger interrupts (assuming one-lead ultrasonic modules). Write all lows to the port, write highs, wait for the various bits to go low, wire minimal interrupt gating, and update a small register for each bit that changes state.

Easy.
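
A rough polled sketch of that scheme, skipping the interrupt gating: it assumes a legacy port at base address 0x378 with the trigger lines on the data register, up to five echo lines wired back into the status register at 0x379 (the only input bits available there), and it ignores the hardware inversion on the BUSY bit. Root is required for ioperm():

    /* Sketch only: trigger several one-lead sonar modules from the parallel
       port data register and time each echo line by polling the status bits. */
    #include <stdio.h>
    #include <sys/io.h>
    #include <time.h>

    #define BASE      0x378              /* data register (outputs)  */
    #define STATUS    (BASE + 1)         /* status register (inputs) */
    #define NUM_UNITS 5

    static double now_us(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec * 1e6 + ts.tv_nsec / 1e3;
    }

    int main(void)
    {
        double echo_us[NUM_UNITS] = {0};

        if (ioperm(BASE, 3, 1) < 0) { perror("ioperm"); return 1; }

        outb(0x00, BASE);                /* all trigger lines low      */
        outb(0x1F, BASE);                /* raise triggers: fire units */
        double start = now_us();

        /* Poll until every echo bit has gone low, or give up at 50 ms. */
        unsigned char pending = 0x1F;
        while (pending && now_us() - start < 50000.0) {
            unsigned char in = (inb(STATUS) >> 3) & 0x1F;   /* status bits 3..7 */
            for (int i = 0; i < NUM_UNITS; i++) {
                if ((pending & (1 << i)) && !(in & (1 << i))) {
                    echo_us[i] = now_us() - start;          /* note time of change */
                    pending &= ~(1 << i);
                }
            }
        }

        for (int i = 0; i < NUM_UNITS; i++)
            printf("unit %d: %.0f us\n", i, echo_us[i]);
        return 0;
    }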

Yeah, I was doing the math in my head earlier in the morning (in a waiting room) and was thinking something like 200us per inch (using 600 mph -- a nice round number). When I got home I put it all in a spreadsheet (but forgot the conversion from feet to inches) and thought I had just moved a decimal point in the wrong place earlier. The numbers pull in the slack I was expecting, but it doesn't affect the viability.

Reply to
mlw

mlw wrote: [snip]

I'm not -- as I said, should interrupts be disabled for an extended period of time you WILL miss an incoming pulse. Whether this happens or not depends on the other active device drivers, what the system is doing at the time and, I suppose, on the quality of the drivers themselves.

I just don't think you're going to run into this often enough that it will pose much of a problem.

You'd be surprised, though. I've run multiple robots (OK, two) in a relatively small space with minimal problems with everybody using sonar. You can compensate to a limited extent in software.

There is a paper floating around somewhere which describes a method for running multiple robots using sonar with minimal "crosstalk". Can't recall it offhand, though.

I bet I can think of an easier way...

In any case, be aware that many units are two-lead (excluding power, obviously) w/ separate trigger and echo lines. You can add discrete circuitry to deal with this, of course. I think this may also preclude using the parallel port (without more supporting circuitry) as an I2C master, no?

I've learned not to do math in my head when not absolutely necessary.

Reply to
the Artist Formerly Known as K

Not sure if you'll get better results with the Polaroid or SRF08 for mapping, as all ultrasonic sensors see parallel walls ahead of the robot as a converging cone. So you have to steer the beastie all around the room in order to make perpendicular readings of walls, and accommodate for arbitrary obstacles within that zone. It's done, but takes time.

A member of our local robot user group, an expert in vision analysis, has in progress a not-too-complex image system for supplementing ultrasonic mapping. It's based on detecting and measuring straight lines, and her demo of it two Saturdays ago was very promising. The thing about most rooms -- the ones you'd likely map anyway -- is that they're full of lines made by corners, intersections, and doorways, even some furniture. (Most people, cats, and other such objects aren't made up of straight lines, so the system naturally ignores these and leaves them for the ultrasonic to detect and include in the map.) Scanning a camera to build a map makes for faster mapping, without having to drive along the walls.

Alas (for you anyway) it's all based on DirectShow...

This goes along with some of the thinking we've talked about here of using a camera with a known field of view and measuring the angles of the ceiling corners of rooms. Similar intersecting lines are present even if the corners of the walls are obstructed by things like bookshelves or irregularly-shaped outcroppings (built-in closets, etc.), using structured light or just outline detection. A robot could theoretically map a smallish room without changing its position to get a better view.

-- Gordon

Reply to
Gordon McComb

*anything* is possible when you can program the hardware directly; even an RTOS is subject to device drivers turning off the interrupts. Bad software is bad, right? The way I look at it, I only have a handful of I/O drivers, and I think the biggest risk I have is the USB driver. The disk driver should be OK. The sound driver, I believe, uses PCI bus mastering. The network card seems solid as well.

I will, however, be hooking up my scope to the parallel port with a signal generator to actually see the latency under load.
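
One way to wire that up, as a polled sketch rather than a true interrupt-latency measurement: feed the generator into the ACK line of the status register, echo each edge back out on data bit 0, and read the delay between the two edges off the scope. The 0x378 base address and the ACK wiring are assumptions:

    /* Sketch: echo the signal generator's edge back out the data register so
       a scope can show the software response time (and its jitter under load). */
    #include <stdio.h>
    #include <sys/io.h>

    #define BASE   0x378
    #define STATUS (BASE + 1)
    #define ACK    0x40                      /* status register bit 6 */

    int main(void)
    {
        if (ioperm(BASE, 3, 1) < 0) { perror("ioperm"); return 1; }

        for (;;) {
            while (!(inb(STATUS) & ACK))     /* wait for the generator edge     */
                ;
            outb(0x01, BASE);                /* scope sees this pin go high     */
            while (inb(STATUS) & ACK)        /* wait for the edge to drop again */
                ;
            outb(0x00, BASE);
        }
    }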

Well, that is really one of the underlying principles of the sub-$500 robot. When you think about it, the real world is very messy and not very precise; using a lot of precision to measure things whose basic uncertainty far exceeds your margin of error doesn't make sense.

It's like trying to hit a 1mm diameter target with a paintball. Plus or minus a few centimeters is really about as good as you can realistically get given the nature of what it is you are attempting to do.

Let me guess.....

Interestingly, the I2C bus device does not use the actual data lines of the parallel port. It only uses the signal lines.

As for the two-lead rangers, I've seen those as well; it doesn't really matter too much, as I am thinking of a generic I/O port expander board (bought cheap or built) that will do what I need and be cheaper (and faster) than the Velleman K8000 board I am currently using. I'm not really happy with the K8000 for a robot; I'm just using it to do development for now.

Funny, the numbers I was doing in my head were more accurate than my spreadsheet (until I fixed it.) Just out of high school I was a bank teller, and I could run my balance totals in my head, which was cool because when the adding machine was wrong I could see where I pressed the wrong key.

Reply to
mlw

Well -- this is the basic problem with sonar mapping. You have to do a lot of wandering around initially.

Actually, I swing either way -- but an algorithm would be nice.

If she has any details to pass along, they'd be appreciated. I'd played with using sonar some time ago in conjunction with a WNN image recognition code to help me determine scale. This actually worked pretty well.

Reply to
the Artist Formerly Known as K

Robin's Web site is here:

formatting link
Her over-arching vision software is called Mavis. Not sure how up-to-date the downloads are.

BTW, this is a different Robin Hewitt than sometimes posts here.

-- Gordon

Reply to
Gordon McComb

I'm going to be running a ring of 12 SRF08s through a neural network and seeing what comes out. Hopefully I can get a better picture of the world this way.

-- D. Jay Newman

formatting link

Reply to
D. Jay Newman
