Accelerometer or G-Sensor code???? - Windows Mobile Development and Hacking General

Does anyone know of any code whereby I can read from the phones sensors to see what physical orientation it is in?
Or any workarounds?
My apps will be much better if I can add auto-rotation to them.

For C++ native code, look at http://scottandmichelle.net/scott/cestuff/sensortest.zip
For .NET: http://sensorapi.codeplex.com/SourceControl/changeset/view/11841#

Related

mini-s screen orientation change

Hi,
I've recently been handed a project that requires forms-based data capture (on an XDA mini-s). I don't want to limit users to either landscape or portrait mode when completing the forms.
Is there a way to detect the screen orientation and/or get an event when it changes?
thanks
Andy
what language ?
win32?
.net?
yes good point
.net CF v2
C#
Sorted now - it actually generates a Form.Resize event
I was just overcomplicating the task
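For anyone finding this thread later: the Resize trick works because rotating the screen swaps the client area's width and height, so the handler only has to compare the two. The poster's code is C# on .NET CF; this is just a language-agnostic sketch of that comparison (written in Java here, with hypothetical names):

```java
public class OrientationDetector {
    public enum Orientation { PORTRAIT, LANDSCAPE }

    // Called from the form's Resize handler with the client-area
    // dimensions reported after the rotation.
    public static Orientation fromSize(int width, int height) {
        return width > height ? Orientation.LANDSCAPE : Orientation.PORTRAIT;
    }
}
```

In the C# version you would read `this.ClientSize.Width`/`.Height` inside the `Resize` handler and re-lay-out the form accordingly.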

Touch flow in the .net compact framework

Hi there
I've recently started looking into coding applications for WM6 based on the .NET Framework SDK (using either C# or VB.NET), as I have quite some experience in developing .NET Windows Forms applications. The problem, however, is that handling finger-scrolling events is not natively implemented.
Looking through the web, I found this VERY interesting article
http://www.codeproject.com/KB/miscctrl/MouseGestures.aspx
It enables recognition of both simple and complex on-screen gestures, and it's all compiled into a single class. Based on this, the possibilities are infinite, well beyond left, right, up and down scrolls... meaning that the iPhone's zoom, pan, etc. finger tricks could very well be implemented ^^
I'm working on a first level implementation of the code for a picture viewer, and thought I would share the links with the community, if there are any .net developers around!
Cheers
Have you checked out the HTC Camera Album?
It currently supports gestures such as full circle and half circles
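The core of simple gesture recognition like in that CodeProject article is just classifying the vector between touch-down and touch-up points; complex gestures are then chains of these primitive strokes. A minimal sketch of the classification step (written in Java rather than the article's C#; the threshold value is made up for illustration):

```java
public class GestureClassifier {
    public enum Stroke { LEFT, RIGHT, UP, DOWN, TAP }

    private static final int MIN_DISTANCE = 20; // pixels; below this it's a tap (assumed value)

    // Classify the dominant direction of a drag from (x0, y0) to (x1, y1).
    // Screen coordinates: y grows downward, so dy > 0 means a downward stroke.
    public static Stroke classify(int x0, int y0, int x1, int y1) {
        int dx = x1 - x0;
        int dy = y1 - y0;
        if (Math.abs(dx) < MIN_DISTANCE && Math.abs(dy) < MIN_DISTANCE) {
            return Stroke.TAP;
        }
        if (Math.abs(dx) >= Math.abs(dy)) {
            return dx > 0 ? Stroke.RIGHT : Stroke.LEFT;
        }
        return dy > 0 ? Stroke.DOWN : Stroke.UP;
    }
}
```

On .NET CF you would feed this from the control's MouseDown/MouseUp events; circles and half-circles need the intermediate MouseMove points as well, which is what the linked article's recognizer tracks.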

[VB.Net] DirectDraw or Similar

Hey Guys!
I am searching for a way to draw on the screen with DirectDraw, sprites, or similar fast drawing operations (with transparency and so on). Does anyone know how it works?
I also want to block the camera and other hardware keys so I can use them exclusively for my program
thx!!
SciLor
Direct Draw
I really want it too, but Microsoft didn't build any DirectDraw library for VB.NET & it only supports C++
Is there another API you can use?
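Even without managed DirectDraw bindings, the "sprite with transparency" part is conceptually just a color-keyed blit: copy every pixel except the ones matching a designated transparent color. A rough illustration of that idea (plain Java here, purely to show the technique; on .NET CF the usual managed route is reportedly `ImageAttributes.SetColorKey` with `Graphics.DrawImage`, which does this per-pixel skip for you):

```java
import java.awt.image.BufferedImage;

public class SpriteBlit {
    // Copy sprite onto dest at (dx, dy), skipping pixels whose RGB
    // equals keyColor (the "transparent" color, e.g. magenta 0xFF00FF).
    public static void blit(BufferedImage dest, BufferedImage sprite,
                            int dx, int dy, int keyColor) {
        for (int y = 0; y < sprite.getHeight(); y++) {
            for (int x = 0; x < sprite.getWidth(); x++) {
                int argb = sprite.getRGB(x, y);
                if ((argb & 0xFFFFFF) != (keyColor & 0xFFFFFF)) {
                    dest.setRGB(dx + x, dy + y, argb);
                }
            }
        }
    }
}
```

Combined with drawing everything into an off-screen buffer first and copying the buffer to the screen once per frame (double buffering), this gets you most of what DirectDraw sprites provide, though without the hardware acceleration.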

Sensor API now supports Omnia's Light Sensor - grab the new source

If you're a developer, you can now easily access the Omnia's Light Sensor through the Unified Sensor API. Have fun coding!
http://www.koushikdutta.com/2008/11/unified-sensor-sdk-now-support-omnia.html
Oops, posted the wrong link.
Thanks Koush!
The code needed some adjustments, but I finally worked it out.
And now PocketShield is supported as well for Samsung Omnia.

[Q] Using accelerometer to perform actions

So I'm in the preliminary stages of doing an experimental HCI project for a capstone course. I have no Android (Java) coding experience, but I do have a background in C++, Python, and Qt, so I'm hoping to be able to pick up Java relatively fast.
Anyway, the idea of my project relies on being able to take accelerometer readings to determine device tilt in the X and Y axes (as documented in the SensorEvent section of the Android Reference guide) and pipe them into an app such as Google Maps to perform a certain action (in this case I'm trying to pan the map).
I know there are games out there that use the accelerometer to determine tilt (e.g. Minion Rush when collecting bananas on the unicorn), so I'm hoping it's well documented....
Is it possible to make an app that's a "skeleton" over Google Maps, so that when the app detects certain orientations of the phone it executes the appropriate function, e.g. panning in the direction the phone is tilted? It looks like there's the "CameraUpdateFactory.scrollBy(float, float)" function to pan the map, which takes two floats as arguments. Can I take the accelerometer readings (X and Y) and plug them into this function? I'm somewhat familiar with Qt signals and slots; is there something like that in Android app development, so that when the phone is tilted past a certain angle it emits a signal that can be captured and plugged into the above function?
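On the signals-and-slots question: Android has no direct equivalent, but the listener-callback pattern plays that role. You register a SensorEventListener with SensorManager, and in onSensorChanged you read event.values[0]/[1] and map them to the two floats scrollBy expects. The mapping itself could be pure logic like the sketch below (the dead-zone and gain constants are made-up illustration values, not anything from the Maps API):

```java
public class TiltToPan {
    private static final float DEAD_ZONE = 1.0f;  // m/s^2; ignore small tilt (assumed value)
    private static final float GAIN = 10.0f;      // pixels per m/s^2 (assumed value)

    // Map one accelerometer axis reading to a scroll delta in pixels.
    // On Android you'd call this from SensorEventListener.onSensorChanged
    // with event.values[0] (x) and event.values[1] (y), then feed the two
    // results to CameraUpdateFactory.scrollBy(xPx, yPx).
    public static float toScrollDelta(float accel) {
        if (Math.abs(accel) < DEAD_ZONE) {
            return 0f; // within dead zone: don't pan
        }
        // Subtract the dead zone so panning ramps up smoothly from zero.
        float excess = accel - Math.signum(accel) * DEAD_ZONE;
        return excess * GAIN;
    }
}
```

The dead zone keeps the map still when the phone is roughly flat; without it, sensor noise would make the map drift constantly.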
Also can anyone point me to any HCI research papers that may deal with this topic or keywords to be searching for?
