Interfacing with Arduino from Omni - Raspberry Pi General

Hi
Is there a way to interface with an Arduino or similar over SPI or serial, through either GPIO or USB, from Omni?
Even better would be to interface directly with a gyro sensor. But I figure that, in the absence of the libs from Adafruit etc. to drive the sensors, the easiest route would be to put a low-cost MCU between the sensor and the Pi to translate the data into something easily transmitted over a standard protocol.
The reason is to control a Unity app with a gyro for this project, since Android is the only way to run Unity on a Pi. But I figure a Python script can interface with the MCU rather than Unity talking to it directly.
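Roughly the kind of bridging I have in mind, whatever language the bridge ends up in (just a sketch; the one "gx,gy,gz" line per sample format is my own assumption, and the actual serial/USB transport is exactly the open question):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.StringReader;

    // Sketch of the Pi/phone side of the bridge: the MCU streams one
    // comma-separated "gx,gy,gz" line per gyro sample and we parse it here.
    // The line format is an assumption; in the real setup the reader would
    // wrap whatever serial/USB stream the MCU exposes.
    public class GyroBridge {

        // Holds one parsed gyro sample (whatever units the MCU sends).
        static class GyroSample {
            final float x, y, z;
            GyroSample(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }
        }

        // Parse a single "gx,gy,gz" line; returns null for malformed lines.
        static GyroSample parseLine(String line) {
            String[] parts = line.trim().split(",");
            if (parts.length != 3) return null;
            try {
                return new GyroSample(Float.parseFloat(parts[0]),
                                      Float.parseFloat(parts[1]),
                                      Float.parseFloat(parts[2]));
            } catch (NumberFormatException e) {
                return null;
            }
        }

        public static void main(String[] args) throws IOException {
            // Stand-in for the serial stream: two fake samples.
            BufferedReader in = new BufferedReader(new StringReader("0.01,-0.20,9.78\n0.02,-0.19,9.80\n"));
            String line;
            while ((line = in.readLine()) != null) {
                GyroSample s = parseLine(line);
                if (s != null) {
                    System.out.printf("gyro x=%.2f y=%.2f z=%.2f%n", s.x, s.y, s.z);
                }
            }
        }
    }

A Python version with something like pyserial would do the same thing: read a line, split on commas, and hand the three values to whatever is feeding Unity.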
Thanks!

Related

Connecting WiiMote

Hi, I'm a student of IT. My final project is on remote monitoring of patients. One of the features is the detection of falls and I have to use two types of accelerometers (Witilt and WiiMote).
Market applications that use the Wiimote do not work on our i9000.
And my question is: is it possible, with the standard ROM, to write a program that connects to the Wiimote?
If so, could someone guide me a little on how to start, or post a link where I can see a starting point for this part of my project?
If it is not possible, I would appreciate an explanation of why that is.
Thanks in advance
Nobody can help me? =)
Can't you just use the accelerometer and orientation sensors built into the phone? If you have to use the external hardware in conjunction with the phone I would say use an Arduino, which has a lot of Wiimote and Wiichuck interface code available, with a Bluetooth module to talk to the phone; look at the Amarino project for interface code.
Alternatively, the Wiimote IME developer had problems with the SGS and HTC phones because they were using a native library (most phones didn't have Bluetooth HID support, and the Bluetooth native library didn't behave the same way on all phones). Because the SGS with Froyo has Bluetooth HID support, you may be able to interface with the Wiimote at a higher level and avoid the native libraries, provided you take care of its quirky pairing behavior, but you will have to do more work with the Wiimote low-level protocol, which is quite well documented.
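To give a flavor of that low-level protocol: once the HID transport hands you an input report, pulling the accelerometer values out is just byte arithmetic. A rough sketch based on the publicly documented 0x31 report layout (untested, and the transport that actually delivers the bytes is the hard part I'm leaving out):

    // Rough sketch of decoding a Wiimote 0x31 (buttons + accelerometer) input
    // report, following the commonly documented layout: byte 0 = report ID,
    // bytes 1-2 = button bits, bytes 3-5 = accel X/Y/Z (most significant 8 bits;
    // the extra low bits packed into the button bytes are ignored here).
    public class WiimoteReport {

        public static void decode(byte[] report) {
            if (report.length < 6 || (report[0] & 0xFF) != 0x31) {
                return; // not a 0x31 report
            }
            int buttons = ((report[1] & 0xFF) << 8) | (report[2] & 0xFF);
            int ax = report[3] & 0xFF; // roughly 0x80 at rest, per-unit calibration varies
            int ay = report[4] & 0xFF;
            int az = report[5] & 0xFF;
            System.out.printf("buttons=0x%04x accel=(%d, %d, %d)%n", buttons, ax, ay, az);
        }

        public static void main(String[] args) {
            // Fake report: ID 0x31, no buttons pressed, Wiimote lying flat.
            decode(new byte[] { 0x31, 0x00, 0x00, (byte) 0x80, (byte) 0x80, (byte) 0x9A });
        }
    }

The buttons are two bytes of bit flags, and the accelerometer zero point sits around 0x80 per axis, with the exact calibration stored on the Wiimote itself.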

[DEV][AADK] Google's Arduino based "ADK" Working on Nexus One

I managed to snag an Android Accessory Development Kit from Google IO.
After wrangling all the necessary code bits together I got the demo code running on the Arduino board and my Nexus One.
In the first picture you can see the phone reading the demo shield's sensors: button states, temperature, a light sensor and the joystick position.
In the second picture the phone is controlling the LED colors and has one relay turned on.
In the last picture you can see that the phone detects the board being plugged in; Android knows there is no app installed for the board and that it cannot be found in the Market.
The Nexus One is running a rooted 2.3.4 ROM from this thread
P.S. Mods, can we get a forum section for Android Accessory Development?
Now for Pics.
Now that is bad a**!!! I was waiting for something like this!!
Hey, I've been attempting to hack in the support into CM7 (nightly, 2.3.4) on my EVO without much success. I've rebuilt the kernel with the necessary flag enabled and ripped the JAR/XML files from the Nexus S update.
I've monitored logcat and seen that it does find the framework JAR (the application wouldn't install otherwise since it's a needed feature) and a dmesg scan shows that the kernel driver is being initialized.
What's happening now is I plug in the ADK and the output from the Arduino board spams that it couldn't get a protocol version from the phone. The phone slows down to a crawl, as it's probably being spammed with requests from the ADK for a protocol version and doesn't know what to do.
I'm at a loss here as to what I could possibly be missing. If you have any insight through your own endeavors it would be much appreciated.
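For reference, the app-side path that needs that framework JAR is roughly the DemoKit pattern below (written from memory, so details may be off). As far as I understand, the protocol-version request itself is answered by the kernel's accessory gadget code before any app gets involved, which is why I suspect something below the app layer.

    import android.app.Activity;
    import android.os.ParcelFileDescriptor;
    import com.android.future.usb.UsbAccessory;
    import com.android.future.usb.UsbManager;
    import java.io.FileDescriptor;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;

    // Sketch of the app-side accessory open, modeled on the DemoKit sample.
    // Requires the Google APIs (level 10) add-on library.
    public class AccessoryActivity extends Activity {
        private FileInputStream mIn;
        private FileOutputStream mOut;

        @Override
        protected void onResume() {
            super.onResume();
            UsbManager manager = UsbManager.getInstance(this);
            UsbAccessory[] accessories = manager.getAccessoryList();
            if (accessories == null) return; // board not attached (or not detected)
            ParcelFileDescriptor pfd = manager.openAccessory(accessories[0]);
            if (pfd == null) return;         // permission not granted yet
            FileDescriptor fd = pfd.getFileDescriptor();
            mIn = new FileInputStream(fd);   // sensor/button data from the board
            mOut = new FileOutputStream(fd); // LED/relay commands to the board
        }
    }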
Great stuff! I was also at Google I/O and picked up an ADK. Can you post the apk file of your app? I'd love to try it out.
And if you're feeling generous...the code?
badass. good ****
Well done mate...
Have you tried it out with an Arduino UNO?...
Ugh, why wasn't I at Google I/O?
Google is doing really good s**t nowadays. I suppose an Arduino will be a guest in my house before long, as I am fascinated.
I am actually using the ADK with my Nexus One on an Arduino UNO and a USB Host Shield from SparkFun; it works just the same but costs only a fraction to buy.
My first project is an interface for my Audi. At the moment I only use it to start the motor, but in the future I want to try to build a CAN bus interface...
Sure here is the compiled ADK.
-Nik
bharathp666 said:
Well done mate...
Have you tried it out with an Arduino UNO?...
The Arduino UNO doesn't have native USB Host support onboard like the megas.
You will need a USB Host shield and will have to modify the Arduino code.
SoyoBro said:
Great stuff! I was also at Google I/O and picked up an ADK. Can you post the apk file of your app? I'd love to try it out.
And if you're feeling generous...the code?
You can find all the ADK instructions and code here.
Note: When you select your build target you must choose
Target Name - Vendor - Platform - API Level
"Google APIs" - "Google Inc." - "2.3.3" - "10"
Otherwise you will get errors trying to build against the new libs. The instructions on the ADK page weren't very clear about this. Took me a while to figure that one out.
You will need to update your Android SDK if you don't see those options.
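If you build from the command line instead of Eclipse, the matching line in the project's default.properties should be something like this (from memory, so double-check it against the output of "android list targets"):

    target=Google Inc.:Google APIs:10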
-Nik
As for the point of using an Arduino UNO, Oleg is providing a newer version of his USB lib; it now works with his shield and the ADK.
My car is almost starting with the ADK; I am only missing a few relays and such to get it completed, but I posted a proof of concept on YouTube. The text and explanation are all in German, sorry for that.
http://www.youtube.com/watch?v=FlvpMwSxgMg
If there are any questions, I'll be here for you.
Great stuff. I am really looking forward to this.
But what I don't get is which hardware and software are required:
- is 2.3.4 sufficient?
Answer: YES
- do devices other than the Nexus One/S (running 2.3.4) work?
Answer: Pending
- will an Arduino Duemilanove board work?
Answer: NO
Thanks for your help, guys
Besides those questions I have one more:
- assuming I have a board that is connected to a power supply, will I be able to charge an Android device when connecting the board to the handset via USB?
Answer: depends on the board and its power consumption/output. Basically, it should work.
Nikropht said:
P.S. Mods, can we get a forum section for Android Accessory Development?
+1 for a dedicated section.
I reckon it's gonna take off as soon as more USB host boards become available
My Arduino Duemilanove works perfectly

[APP] MultiWork Logic Analyzer

Hello! I wanted to share an app I have been developing for about a year. It is my first big programming project, so any suggestions to improve my coding are welcome!
MultiWork (MW) is a logic analyzer tool: there is a hardware side built with a microcontroller and a software side, which is the Android app. Right now it is simply capable of decoding the UART and I2C protocols at a maximum sample rate of 40 MHz, but new protocols will be added, and maybe in the future a higher sample rate and more memory to store the data.
The hardware side is USB and battery powered, but currently it only works with USB chargers capable of providing 500 mA without USB enumeration. I am working on the new hardware and I hope to get it ready in a few weeks.
The app is made with Android Studio, but you can directly download the APK I attached. Here is the GitHub Repository
The API used to decode the protocols is available as a separate JAR file, whose source code is here.
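To give an idea of what the decoder does internally, here is a heavily simplified sketch of UART decoding from raw logic samples (illustration only, not the real MultiWork code; 8N1 framing and LSB-first bit order assumed):

    import java.util.ArrayList;
    import java.util.List;

    // Simplified illustration of UART (8N1, LSB-first) decoding from raw logic
    // samples, the kind of work the decoder API does.
    public class SimpleUartDecoder {

        public static List<Integer> decode(boolean[] samples, double sampleRate, double baudRate) {
            List<Integer> bytes = new ArrayList<>();
            double samplesPerBit = sampleRate / baudRate;
            int i = 1;
            while (i < samples.length) {
                // Start bit: falling edge from the idle-high line.
                if (samples[i - 1] && !samples[i]) {
                    double bitCenter = i + samplesPerBit * 1.5; // middle of data bit 0
                    int value = 0;
                    for (int bit = 0; bit < 8; bit++) {
                        int idx = (int) Math.round(bitCenter + bit * samplesPerBit);
                        if (idx >= samples.length) return bytes;
                        if (samples[idx]) value |= (1 << bit); // LSB first
                    }
                    bytes.add(value);
                    i += (int) Math.round(samplesPerBit * 10); // start + 8 data + stop
                } else {
                    i++;
                }
            }
            return bytes;
        }

        public static void main(String[] args) {
            // Build a fake capture of the byte 0x55 at 8 samples per bit.
            double sampleRate = 8, baud = 1;
            boolean[] s = new boolean[8 * 14];
            java.util.Arrays.fill(s, true);        // idle high
            int[] bits = {0, 1,0,1,0,1,0,1,0, 1};  // start, 0x55 LSB-first, stop
            for (int b = 0; b < bits.length; b++)
                for (int k = 0; k < 8; k++)
                    s[16 + b * 8 + k] = bits[b] == 1;
            List<Integer> out = decode(s, sampleRate, baud);
            for (int b : out) System.out.printf("decoded: 0x%02X%n", b);
        }
    }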
Thank you very much and sorry for my bad English!
Hi, thanks for your effort in programming this logic analyzer, but do you have an English version that supports the USBee AX Pro?

[Q] Using accelerometer to perform actions

So I'm in the preliminary stages of doing an experimental HCI project for a capstone course. I have no Android (Java) coding experience, but I do have a background in C++, Python, and Qt, so I'm hoping to be able to pick up Java relatively fast.
Anyway, the idea of my project relies on being able to take accelerometer readings to determine device tilt in the X and Y axes (as documented in the SensorEvent section of the Android Reference guide) and pipe them into an app such as Google Maps to perform a certain action (in this case I'm trying to pan the map).
I know there are games out there that use the accelerometer to determine tilt (e.g. Minion Rush when collecting bananas on the unicorn), so I'm hoping it's well documented...
Is it possible to make an app that's a "skeleton" over Google Maps, so that when the app detects certain orientations of the phone it executes the appropriate function, e.g. pans in the direction the phone is tilted? It looks like there's the "CameraUpdateFactory.scrollBy(float, float)" function to pan the map, which takes two floats as arguments; can I take the accelerometer readings (X and Y) and plug them into it? I'm somewhat familiar with Qt signals and slots; does something like that exist in Android app development, so that when the phone is tilted past a certain angle it emits a signal that can be captured and fed into the above function?
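For what it's worth, the closest Android equivalent to a signal/slot seems to be registering a SensorEventListener and calling the map from its callback. A rough sketch (the thresholds, axis signs and pixels-per-tilt scale below are guesses that would need tuning):

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import com.google.android.gms.maps.CameraUpdateFactory;
    import com.google.android.gms.maps.GoogleMap;

    // Sketch: pan a GoogleMap with the accelerometer. onSensorChanged plays the
    // role of the Qt "slot"; the sensor framework delivers the "signal".
    public class TiltPanController implements SensorEventListener {
        private static final float TILT_THRESHOLD = 1.5f; // m/s^2, ignore small tilt
        private static final float PIXELS_PER_UNIT = 10f; // pan speed

        private final GoogleMap map;

        public TiltPanController(GoogleMap map, SensorManager sensorManager) {
            this.map = map;
            Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float x = event.values[0]; // tilt left/right
            float y = event.values[1]; // tilt forward/back
            float dx = Math.abs(x) > TILT_THRESHOLD ? -x * PIXELS_PER_UNIT : 0f;
            float dy = Math.abs(y) > TILT_THRESHOLD ?  y * PIXELS_PER_UNIT : 0f;
            if (dx != 0f || dy != 0f) {
                map.moveCamera(CameraUpdateFactory.scrollBy(dx, dy));
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }

You would create it once the map is ready and unregister the listener in onPause to save battery.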
Also, can anyone point me to any HCI research papers that deal with this topic, or keywords I should be searching for?

Need help getting started with app development (Windows, BSD) for a stylus input note-making app!

I have experience in VLSI, shell scripting (Bash, Windows PowerShell) and basic programming languages like C and Python (MATLAB too, which uses a similar syntax).
I want to get into app development, but I am completely new to this and unfamiliar with the whole structure: the I/O libraries and especially the rendering libraries. I mainly want to develop it for Windows 11 (but it would be nice if I could make it cross-platform, say by using something like Vulkan as the rendering library).
I wish to make a stylus-based note-taking app (similar to OneNote, Drawboard PDF, etc.). I wish to know about the libraries available for taking input from the Surface Pen (or other pens). I found that there are a few APIs available for Windows (RealtimeStylus, Windows Ink, etc.), but I am unable to find anything cross-platform. I would like to know if there are open-source or cross-platform alternatives. Alternatively, I would like to know if it is possible to bypass these and create a custom API myself (including my own algorithms for tracing curves, predicting handwriting, etc.; at present I am left to use whatever was done in these APIs, I think), possibly also making it lower latency within the app. To some extent I realized that pen position is very similar to trackpad input (on the data-input-to-PC side), and then we have tilt and pressure sensitivity data, which I'm not sure how to access and use. I remember reading a little about libsdl some time ago. I would like to know if there are alternatives to libsdl, or if Vulkan supports any alternative libraries.
I would like to know how I could code a program that works on both x64 and aarch64 on Windows 11 (not 32-bit, as I believe my tool will use more than 4 GB of RAM anyway), and, as mentioned above, it would be fantastic if I could make it cross-platform. What I got from this page ( https://docs.microsoft.com/en-us/windows/arm/ ) is that if I write my program in C++ it should be possible to compile it for both x64 and aarch64 (and make optimizations for each of them separately). I am not sure how the whole development environment works: what is dotnet, what is Unity, what is Xamarin, and what are the differences between them? I found a few code macros in dotnet that help in rejecting certain inputs (could be useful for palm rejection etc.): ( https://docs.microsoft.com/en-us/do...s.uielement.isinputmethodenabled?view=net-5.0 ) ( https://docs.microsoft.com/en-us/dotnet/api/system.windows.uielement.ishittestvisible?view=net-5.0 ). As far as I am aware, dotnet is cross-platform. I might want to make instruction-level optimizations to the software (like SSE, AVX, certain 64-bit instructions, etc., if that gives any hint) and would like to know if the dotnet environment/toolkit allows low-level enough coding to access these. Also, I am curious whether it supports Vulkan or OpenGL. Vulkan is written in C++ and supports multiple platforms, so I am more inclined to try it.
