We are pleased to confirm that the latest version of the S Pen SDK for the Samsung GALAXY Note is now available to download, bringing the Multi Window feature to the GALAXY Note II. Multiple apps can be loaded in multiple windows, providing more flexibility for the end user and true PC-like multitasking wherever it's needed.
● Multi Window is available on the GALAXY Note II using an S Pen.
- Multi Window and its related APIs were added to improve the usability of apps that utilize the S Pen on the GALAXY Note II.
- The screen can be split vertically or horizontally within apps that use the S Pen without switching to another screen, allowing the user to display related apps on one screen simultaneously.
- Multi Window supports data transfer using the clipboard, as well as drag & drop, to allow easy sharing of data between apps that use Multi Window.
*Note: GALAXY Note II devices for some regions or carriers may require upgrading to the latest firmware to use Multi Window.
● Unnecessary and duplicate APIs have been deprecated, and the names of some APIs have been changed.
- Deprecated methods can be used in the previous 2.2 version and the current version, but they will be removed in future versions.
● Some errors in S Pen SDK 2.2 have been corrected in this release.
The SDK includes 20 sample apps for you to use in your own applications, so you can quickly understand how to make the most of each available method. You can find the samples at http://developer.samsung.com/s-pen-sdk/samples
In addition, there is an extension for developers using the Unity engine, a popular 3D game engine, allowing you to integrate the S Pen SDK library into your apps. To make integrating the S Pen and Unity easier, the S Pen SDK Unity Extension is available as a simple "package file" that is easily consumed in Unity as a Unity plug-in. Download at http://developer.samsung.com/s-pen-sdk/unity-extension
Very nice, I'm definitely going to add support for Multi-Window and S-Pen in my existing apps soon.
Judging from the growing popularity of Samsung devices, it might be worth the effort to implement it (which doesn't seem that big of a deal anyway).
Plus, this would give me a chance to take part in the Samsung Smart App Challenge 2013; the venture capital pitch opportunity is enormous.
I think we need a 'House of the Dead / Time Crisis' style game using S Pen hover and the S Pen button to shoot.
maskerwsk said:
I think we need a 'House of the Dead / Time Crisis' style game using S Pen hover and the S Pen button to shoot.
what an idea!!!
Hi;
I have an application and I want to use it in kiosk mode. On the iPad, there is Guided Access, which is perfect. But for Android, I couldn't find a good solution. Apps like SureLock are good, but expensive. How can I do that?
burakkilic said:
Hi;
I have an application and I want to use it in kiosk mode. On the iPad, there is Guided Access, which is perfect. But for Android, I couldn't find a good solution. Apps like SureLock are good, but expensive. How can I do that?
You could develop your own 'kiosk' application, but that would take a LOT of effort, testing, debugging, etc. You'll also need to have plenty of Android knowledge.
There might be free apps out there, but I'm honestly not aware of any that do a good job. Android does everything it can to NOT let an application do what kiosk apps are trying to do (lock a user completely out of the rest of the OS).
Is your application web-based or a standalone Android application? Either way, another app called KioWare for Android (you can search for it on the Play Store) might be something you'd be interested in. KioWare is a browser application which simply locks down Android to only display the browser. KioWare also has a "Single App Mode" which simply keeps another Android app on top.
I'm not sure if KioWare for Android is more or less expensive than SureLock but I can assure you our support is fantastic. If you ever run into any issues or have questions about your setup you are more than welcome to contact our support team.
Full disclosure: I work for KioWare and am one of the main developers of KioWare for Android.
One idea I had: you could make the app a launcher (you need to put Intent.CATEGORY_HOME or something similar into the intent filter).
Sent from my SM-N9005 using Tapatalk
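For anyone wanting to try the launcher approach: the standard way to register an activity as a home-screen replacement is the HOME and DEFAULT categories in its intent filter. The activity name below is just a placeholder, not from any real app:

```xml
<!-- AndroidManifest.xml: declare the activity as a home-screen replacement.
     Pressing the Home key then (re)opens this activity, which is the basis
     of most simple kiosk setups. -->
<activity
    android:name=".KioskActivity"
    android:launchMode="singleTask">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.HOME" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```

The user still has to select it as the default launcher once, and on its own this doesn't block the status bar or recents key, which is why full kiosk apps take so much more work.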
For Android devices, Single Application Mode in SureLock is the equivalent of iOS Guided Access Mode. It locks your Android device to just one application, hides the bottom navigation bar, prevents access to device settings and notifications, and also disables hardware buttons. You can use this feature in its trial version for as long as you want.
Hi there,
For now, Android Wear only supports swipe gestures on the touchscreen.
It would be awesome to be able to - for example - turn your wrist to trigger actions.
Does anyone have any information to share regarding gesture detection using wrist movements with Android Wear?
If official support won't be available anytime soon, do you think this can be enabled via custom ROMs?
How hard do you think it would be to implement gesture detection from the moment a custom ROM for a certain smartwatch is available?
Considering Android Wear is... well, Android, and Android has all the API functions to get data from the accelerometer (And lots of helpful development sources), I assume it wouldn't be too hard?
Or am I overlooking something?
Best regards
Use asus remote camera app for android wear. If you twist your watch it will take a picture with your smartphone camera!
greenbat said:
Use asus remote camera app for android wear. If you twist your watch it will take a picture with your smartphone camera!
Thanks for the fast reply!
So wrist gesture detection is perfectly possible it seems?
But can you or anyone else give me some additional background about Android Wear:
Should I see Android Wear as a simple extension of the Android API? By which I mean: can I utilise all the Android API functions related to interpreting smartphone/tablet sensor data, provided the smartwatch has that sensor (e.g. an accelerometer)?
Since I was looking in this particular section related to wearables (developer.android.com/training/building-wearables.html) and I couldn't find any information about wrist gesture detection.
Is that simply because everything else from the Android API is automatically also applicable to smartwatch development?
(As one can tell, I'm quite new to mobile development.)
https://play.google.com/store/apps/details?id=com.kiwiwearables.app&hl=sv
ezgenesis said:
play.google.com/store/apps/details?id=com.kiwiwearables.app
Thanks, will check this out!
Though from looking at the reviews, it seems there is still a lot to improve.
Also, I found the answer to my question in the meantime. In short: yes, you can access the accelerometer on your Android Wear smartwatch using the API functions originally intended for the accelerometer on Android smartphones.
Source: stackoverflow.com/questions/26473621/android-wear-accelerometer-gyro-sensor
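Once you are getting accelerometer samples, a crude wrist-twist detector can be as simple as watching for a sign flip on one axis within a short window. This is only an illustrative sketch under my own assumptions (the class name, thresholds, and window size are made up, not from any SDK); real gesture recognition needs filtering and usually machine learning:

```java
// Minimal illustrative twist detector: fires when the X-axis reading
// swings from strongly positive to strongly negative (or vice versa)
// within a small number of consecutive samples, as happens when the
// wrist rotates. Thresholds are arbitrary demo values, not tuned.
public class TwistDetector {
    private static final float THRESHOLD = 7.0f; // m/s^2, roughly gravity tipping over
    private static final int WINDOW = 10;        // samples allowed between the two extremes

    private int samplesSincePeak = Integer.MAX_VALUE;
    private float peakSign = 0f;

    /** Feed one accelerometer X-axis sample; returns true when a twist is detected. */
    public boolean onSample(float x) {
        if (Math.abs(x) > THRESHOLD) {
            float sign = Math.signum(x);
            if (peakSign != 0f && sign != peakSign && samplesSincePeak <= WINDOW) {
                peakSign = 0f; // reset so one twist fires only once
                return true;
            }
            peakSign = sign;
            samplesSincePeak = 0;
        }
        samplesSincePeak++;
        return false;
    }
}
```

On a watch you would feed this from SensorManager's onSensorChanged callback, exactly as you would on a phone.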
The Bing app uses a wrist twist to activate
Should be possible soon!
Asgaro said:
Hi there,
For now, Android Wear only supports swipe gestures on the touchscreen.
It would be awesome to be able to - for example - turn your wrist to trigger actions.
Does anyone have any information to share regarding gesture detection using wrist movements with Android Wear?
If official support won't be available anytime soon, do you think this can be enabled via custom ROMs?
How hard do you think it would be to implement gesture detection from the moment a custom ROM for a certain smartwatch is available?
Considering Android Wear is... well, Android, and Android has all the API functions to get data from the accelerometer (And lots of helpful development sources), I assume it wouldn't be too hard?
Or am I overlooking something?
Best regards
Android Wear is a new platform and a lot of fun to develop for; with specs this good, some gestures will definitely come. Even the 2015 launch of the Apple Watch doesn't seem to deliver perfect gesture support beyond the digital crown and on-screen taps. But I used the Asus remote camera app: it's a lot easier, and raising your wrist wakes the watch. You should give it a try!
Asgaro said:
It would be awesome to be able to - for example - turn your wrist to trigger actions.
Does anyone have any information to share regarding gesture detection using wrist movements with Android Wear?
If official support won't be available anytime soon, do you think this can be enabled via custom ROMs?
Hmmmm.... I just created an AccessibilityService that does exactly what you are describing: it gives you the ability to completely control your smartwatch using wrist gestures.
I created it for my own use (I can only use one of my hands, the one in which I wear the watch) and I can fully control it using wrist gestures.
description: https://jsalatas.ictpro.gr/handsfre...-input-method-for-android-based-smartwatches/
source code: https://github.com/jsalatas/HandsFreeWear
playstore: https://play.google.com/store/apps/details?id=gr.ictpro.jsalatas.handsfreewear
Note that at the current stage I'm not sure if the model that recognizes the wrist gestures can generalize, or if it just recognizes my own wrist gestures.
Warning: this is a long post; grab some tea or coffee.
Coming from a Note 2, IR capability on a phone was a new concept for me. I played around with performing some smart control with Peel and a few other apps. Nothing really fit like a professionally programmed universal remote, much like a very advanced version of the Logitech Harmony. I wanted the WAF (Wife Approval Factor): multiple rooms, devices, and scenarios with single-button clicks, with a remote that changes given its room location and control type. Our Note Edge, if you didn't know, has an amazing wide-spectrum infrared transmitter that can work with a mind-blowing array of proprietary devices, like Xbox and MCE machines to name a few. It simply needs a powerful app with a deep code library to drive it.
So I stumbled on an app in the Play Store called "Smart IR Remote - AnyMote". Before I go further, I'll point out that I'm not affiliated with any developer and have no benefit from pointing this app out; I'm just a gadget junkie and a fan of great tech, but I digress. This app is not on the cheap side: it's around $6, if I recall. What turned me on to it was its continuous and constant development record, a developer who interacts with users regularly, and the fact that it met my minimum requirements for experiments.
Setup can be as simple as a basic Logitech or as complex as professional gear; it is your choice. What struck me with awe was the ability to create custom remotes WITH full macros, or "scenes" in the pro world, and integrate them into one button. Take multiple remotes and combine buttons from all of them into one uber-remote. Control WiFi devices along with it. Yup, multiple-pathway control. Holy cow, I was so excited: I'd finally found something to rival some of the most expensive remotes I have ever owned, for $6. It even comes equipped with thermostat libraries along with some home-control capacity. My brain was tingling with logic layouts and control schemes.
After some serious testing in my house, I'm happy to report only one obscure device did not have a library entry for its IR codes. One email to the dev and he was on it. I have set up multiple rooms with a minimum of 3 scenes per room via macros and custom remote setups. My setup works like this: the default remote is the "Home remote", which has buttons labeled per room. Select the room you want to control, Master Bedroom for example, and the Master Bedroom remote pops up. Then just select one of the action options; "Chromecast", for instance, turns the TV on and switches the input to the proper source, and a Chromecast remote pops up with volume and other actions like "Watch Movie". That one turns on my HTPC, which basically comes out of sleep, flips the TV input over, and brings up the HTPC remote with all the functions for navigation, select, and volume. The cool part of this particular scene is that if you switch to something else, you simply add the sleep option to the macro and the HTPC will go to sleep.
It can get far more complex, but I set this up on my wife's phone, which has IR, and have achieved successful WAF!
So play and have fun with the infrared on your phone. If anyone is interested, I'd be happy to post some examples, both simple and complex. Hope this helps someone else, and be sure to give a thanks if it did. Cheers.
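For anyone curious what apps like this actually send: Android exposes the IR blaster through ConsumerIrManager.transmit(frequency, pattern), where the pattern is a list of alternating on/off durations in microseconds. Here is a hedged sketch of building such a pattern for the widely used NEC protocol; the timings are the standard NEC values, but the helper class itself is hypothetical (not from AnyMote), and real NEC encoders also care about per-byte bit order, which is simplified here:

```java
// Builds a ConsumerIrManager-style on/off pattern (microseconds) for a
// 32-bit NEC-like code: 9ms leader mark, 4.5ms leader space, then 32 bits
// (560us mark, followed by a 560us space for 0 or 1690us space for 1),
// and a final 560us stop mark. Bits are emitted MSB-first for simplicity.
public class NecPattern {
    public static int[] build(long code32) {
        int[] pattern = new int[2 + 32 * 2 + 1]; // 67 entries
        int i = 0;
        pattern[i++] = 9000;  // leader mark
        pattern[i++] = 4500;  // leader space
        for (int bit = 31; bit >= 0; bit--) {
            pattern[i++] = 560; // bit mark
            pattern[i++] = ((code32 >> bit) & 1) == 1 ? 1690 : 560; // bit space
        }
        pattern[i++] = 560;    // stop mark
        return pattern;
    }
}
```

On a phone with an IR blaster you would then call something like consumerIrManager.transmit(38000, NecPattern.build(code)), since NEC uses a 38 kHz carrier.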
Reserved
Nice review! Just some questions: can you use the edge display? Is it compatible with the Logitech Squeezebox and Logitech Link over WiFi?
Thank you
Sent from my SM-N910F using Tapatalk
danilos2k said:
Nice review! Just some questions: can you use the edge display? Is it compatible with the Logitech Squeezebox and Logitech Link over WiFi?
Thank you
Sent from my SM-N910F using Tapatalk
The edge display is not affected; the app works just like any other. You can add an app icon to the edge display as a shortcut. That's not app-specific, but handy for any apps you use regularly.
As for question 2, the short answer is no. I had to look up the Logitech Squeezebox since I don't have one. It looks like they don't have a library entry for it, but I'm sending a message to the dev team and inviting them to this forum.
Great post! I too find the IR capabilities that Samsung phones have a very cool feature. It amazes me that so many people don't even take advantage of it, but then again there are so many people that don't take advantage of, or know how to use, 90% of the features on their phones. I sold my friend a nice Sony LED TV a couple weeks ago, and after a few hours he called me asking for the remote, because the buttons on the side were a pain to deal with and he didn't have a universal remote. I was blown away by this, because not only had I sold him my Galaxy S5 over a month prior to the TV, he'd had the Galaxy S4 for probably a year before the S5, and when I told him to just use his phone (it's better anyway) he had no clue what I was talking about... Anyway, sorry to ramble, lol. Thanks again for the info.
thank you
That's what keeps us, the developers, going. Seeing appreciation like yours makes me want to improve the product, and makes me ashamed of the occasional bugs you probably encountered, but, all things considered, it leads to a better product.
By the way, @Cab121, make sure to make a backup of your remotes in Settings -> Backup / Restore. You seem to have put a lot of work in yourself, and only I know how many customers relied on (eventually broken) Helium backups when switching ROMs, or cleared the app data accidentally or without knowing what they were doing, losing their precious work.
As for some of the other questions in the thread:
- Using the Edge - no, right now Smart IR Remote doesn't make use of the Edge feature. It's certainly something we want to add, but we have so much on our plate right now that it might take a while to get to this. Have you seen AnyMote Home?
- Logitech Squeezebox - yes, that's supported over InfraRed, in our Game / Media Manager category
- Logitech Link - sorry, access to that is closed. We've tried, but no luck.
By the way, if anyone has any support questions (missing devices, bugs, stuff like that..) please email [email protected] - that way you have the highest (guaranteed) chances of getting a reply.
Source: I'm one of the Smart IR Remote - AnyMote devs.
S Pen on a non-TouchWiz ROM
The most compelling reason for buying an S Pen Samsung device, to me, was the vision of and possibility for a paper-free lifestyle.
Moving seamlessly between the paper-free and paper-bound worlds is a must along that journey.
You can just about see/feel/make out the possibility of everything coming together in a best-of-all-worlds type setup...
So here it is: a capable device that has all the firepower but none of the vision from its manufacturer. Oh well...
I bought a Samsung Galaxy Note 10.1 (2014 Edition) Wi-Fi-only device with S Pen in November 2014.
I don't like TouchWiz for its bloat and its gimmicky "locked data" feeling.
I installed CyanogenMod 12 and then the Temasek ROM port.
Please share your current setup.
Specific apps you use, relevant settings, and how smoothly your setup integrates (screenshot examples, etc.).
The more elegant and evolved your setup, the better!!!
Please use the feature request section.
Option 1 : Google's Keep app + Sketch for Keep + Handraw
Option 2 : Note Buddy (free) OR Note Buddy (paid) + Papyrus app
Option 3 : Fiinote - Google Play Store
Option 4 : SpenCommander (beta testing stage and paid ATM)
Option 5 : Dionote
This app utilizes the S Pen's pressure-sensitive abilities (not many apps do)! Well worth further study.
Other decent apps that you should be aware of are
Lecture Notes (bought the pro but haven't used it much)
OneNote (I feel timid relying on a Microsoft app in the Android ecosystem, because of the politics and conflicts of interest that undermine Microsoft's potential for success on Android. Maybe an overreaction or a far-sighted/short-sighted blunder on my behalf... but hey, we all live and learn.)
XDA:DevDB Information
SPen (touchwiz free) CM12(Lollipop), ROM for the Samsung Galaxy Note 10.1 (2014 Edition)
Contributors
xda_nikita
ROM OS Version: 5.0.x Lollipop
Version Information
Status: Testing
Created 2014-12-12
Last Updated 2014-12-11
OK, here is my setup:
CM12 (credit goes to RaymanFX and all contributors) with Google Keep, Handraw and Sketch for Keep, plus a side launcher with Glovebox.
This gives a sort of "Action Memo" knockoff.
Need help with :
- S Pen-driven screenshot select, similar to the TouchWiz S Pen screen-grab app (lasso-select a portion of the screen and have it annotated in an app like Handraw before being pasted into Google Keep).
- Browser-side "copy image" for images that get pasted into a Google Keep note as you browse, etc. (much like the Pinterest button on a desktop browser pins an image directly to a board).
- Browser tab session save (whichever browser is best for this kind of thing) to a Google Drive account, with a corresponding Google Keep note link inside to restore the saved session, would be phenomenal!
- S Pen touch-depth (pressure) sensing in Handraw (or an alternative setup).
Any way for "handwriting recognition" to be usable as a system-wide keyboard input method in any input box field??
Any ideas?
For apps I use Write Beta. Just set up your preferred paper setting and you will be good to go. Maybe it doesn't have the best UI, but it compensates by having the best stroke-smoothing technology.
To make notes, like many of you above, I use Keep.
I am trying to become a developer, and I made this app, which some of you might find useful:
https://play.google.com/store/apps/details?id=com.BCorp.SPasteAnywhere
But I have to say that I abandoned it, because for it to work I need the S Pen SDK, which sucks, and because it has some errors with TouchWiz. Happily, I don't have TouchWiz anymore, so sadly I haven't been able to fix it; it should be fine if anyone wants to try it on CM12, apart from some video glitches that currently affect Rayman's great ROM.
The point is: if anyone can find a better handwriting library that doesn't depend on a third-party app, please tell me and let's make great apps together. I was actually working on an Action Menu replacement for CM12, but see the information I give below.
To do some great stuff for the tablet on CM, we will need the functionality that only this app provides:
https://play.google.com/store/apps/details?id=com.tushar.cmspen
Without that app we cannot detect pen detachment, the pen hover button, etc. (no gestures like the one taking you to Action Memo or the Action Menu). There are simply no broadcasts of these actions in the system. The free version, I think, also has limited functions, so if anyone makes a cool app using anything else, they would require the user to buy this app; and for decent pen stroke detection the user would also have to add the S Pen SDK from the Play Store. Can you see the hassle here? I asked @RaymanFX to look into it and add the CM SPen companion principle into CM12 so developers have an easier time building something.
I just thought I'd let you all know before you get too far ahead of yourselves: the S Pen isn't actually functioning as an S Pen and doesn't get recognized as one, so the S Pen SDK wouldn't work with CM12; as one of the described bugs says, the "e-pen gets recognized as pointer device". I'd also like to mention that I don't actually see any development in this post itself; it should have been put in the General section rather than Original Development, or at least be a comment on the CM12 thread itself, since it is related to that.
dc959 said:
I just thought I'd let you all know before you get too far ahead of yourselves: the S Pen isn't actually functioning as an S Pen and doesn't get recognized as one, so the S Pen SDK wouldn't work with CM12; as one of the described bugs says, the "e-pen gets recognized as pointer device". I'd also like to mention that I don't actually see any development in this post itself; it should have been put in the General section rather than Original Development, or at least be a comment on the CM12 thread itself, since it is related to that.
Even if the ROM detects it as a pointer device, it is still detected as an S Pen within the kernel. The S Pen SDK detects it as such and functions properly. If you want proof, use my app, SPaste, and see that the S Pen makes strokes while a finger acts as an eraser. This doesn't happen with any other pointer device, like a mouse.
Still, as I said, the S Pen actions (hover + button, attach, detach, etc.) are not broadcast to the system (even if the actions are detected in the kernel) and thus, without the CM12 companion app, these functions are unusable. To add something: the APIs to use them are also in the S Pen SDK, but because there are no broadcasts, these functions still don't work with the SDK installed.
You are completely right about this belonging in General.
Pazzu510 said:
Even if the ROM detects it as a pointer device, it is still detected as an S Pen within the kernel. The S Pen SDK detects it as such and functions properly. If you want proof, use my app, SPaste, and see that the S Pen makes strokes while a finger acts as an eraser. This doesn't happen with any other pointer device, like a mouse.
Still, as I said, the S Pen actions (hover + button, attach, detach, etc.) are not broadcast to the system (even if the actions are detected in the kernel) and thus, without the CM12 companion app, these functions are unusable. To add something: the APIs to use them are also in the S Pen SDK, but because there are no broadcasts, these functions still don't work with the SDK installed.
You are completely right about this belonging in General.
That is where you're wrong. A large majority of applications test to see if the S Pen is actually available on the device to make sure there is compatibility, yet none of them can recognize that the device has an S Pen because it isn't giving off the correct signature check. For instance, give GMD SPen Control or the CM S Pen add-on a go and you'll see: the apps are compatible with the device, but they don't actually get the correct response that the device has an S Pen because it's failing a signature check. I've even had contact with the developer to make sure that it isn't an issue with the app and CM12; he sent me a debugging-enabled version of the add-on app, and it was something in the ROM itself preventing the app from verifying that an S Pen is present. Sure, an application can read the pressure strokes, as I've tested with SketchBook, but the issue lies in the ROM having the correct drivers to send out the correct signal that it is in fact an S Pen, which it doesn't, because for starters the mouse icon is showing up, and because the S Pen SDK actually uses its own set of files to utilize the hardware rather than the ROM's files.
Pazzu510 said:
For apps I use Write Beta. Just set up your preferred paper setting and you will be good to go. Maybe it doesn't have the best UI, but it compensates by having the best stroke-smoothing technology.
To make notes, like many of you above, I use Keep.
I am trying to become a developer, and I made this app, which some of you might find useful:
https://play.google.com/store/apps/details?id=com.BCorp.SPasteAnywhere
But I have to say that I abandoned it, because for it to work I need the S Pen SDK, which sucks, and because it has some errors with TouchWiz. Happily, I don't have TouchWiz anymore, so sadly I haven't been able to fix it; it should be fine if anyone wants to try it on CM12, apart from some video glitches that currently affect Rayman's great ROM.
The point is: if anyone can find a better handwriting library that doesn't depend on a third-party app, please tell me and let's make great apps together. I was actually working on an Action Menu replacement for CM12, but see the information I give below.
To do some great stuff for the tablet on CM, we will need the functionality that only this app provides:
https://play.google.com/store/apps/details?id=com.tushar.cmspen
Without that app we cannot detect pen detachment, the pen hover button, etc. (no gestures like the one taking you to Action Memo or the Action Menu). There are simply no broadcasts of these actions in the system. The free version, I think, also has limited functions, so if anyone makes a cool app using anything else, they would require the user to buy this app; and for decent pen stroke detection the user would also have to add the S Pen SDK from the Play Store. Can you see the hassle here? I asked @RaymanFX to look into it and add the CM SPen companion principle into CM12 so developers have an easier time building something.
dc959 said:
I just thought I'd let you all know before you get too far ahead of yourselves: the S Pen isn't actually functioning as an S Pen and doesn't get recognized as one, so the S Pen SDK wouldn't work with CM12; as one of the described bugs says, the "e-pen gets recognized as pointer device". I'd also like to mention that I don't actually see any development in this post itself; it should have been put in the General section rather than Original Development, or at least be a comment on the CM12 thread itself, since it is related to that.
I really like the "Write Beta" app. I loved the undo and lasso features.
"Fiinote" is worth checking out and is definitely the best S Note replacement app I am aware of.
The CM12 ROM is being worked on by RaymanFX. As a developer, he is focused on delivering the Android OS.
S Pen support relies on RaymanFX's efforts to open the S Pen to app developers.
You still need good apps and user-friendly functionality while the developers work out the OS segment.
To avoid polluting RaymanFX's thread with discussions outside of the OS delivery, I've made this thread...
It's meant to give a glancing reference on ways of setting up S Pen functionality after you are done flashing CM12 (or any other Lollipop-based ROM).
The gold standard would be for all S Pen functionality to be exposed at the OS level and the app-developer level, and passed on to the app-user level. Only then can the S Pen be freed from its TouchWiz-chained existence!
The Android stock keyboard is really good!
* We have a decent-sized keyboard
* We have the accessible but not aggressive voice-typing option
(Being a "keyboard and trackpad only" user until recently, I feel "Google Voice" typing is currently under-leveraged by me)
The only letdown when using an S Pen-able device is the missing handwriting input option on the stock keyboard.
"Handwriting recognition" should be available anywhere inside the Android OS where keyboard input is possible.
Here is an example mockup of an Android keyboard that should exist (refer to the image).
Handwriting should be just as understated and persistently available as the "voice typing" icon on the stock keyboard today.
I would like to put the following questions to the wider community, especially active developers.
How do we port/import a handwriting recognition feature into an Android keyboard?
What are the best handwriting recognition apps and projects that lend themselves to further improvement or integration?
Is it very difficult to modify the stock Android keyboard?
What are the essential pieces that need to come together to make this happen?
Hope this makes sense on one level or another.
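On the third question: you don't actually modify the stock keyboard; you build a replacement by declaring an InputMethodService in your own app's manifest, which then becomes selectable in any text field system-wide. A minimal declaration looks roughly like this (the service class and XML resource names here are placeholders, not from any existing project):

```xml
<!-- AndroidManifest.xml: registers a custom input method. Once the app is
     installed and the keyboard is enabled in Settings, it can be chosen in
     any input field, which is exactly where a handwriting panel could live. -->
<service
    android:name=".HandwritingIme"
    android:permission="android.permission.BIND_INPUT_METHOD">
    <intent-filter>
        <action android:name="android.view.InputMethod" />
    </intent-filter>
    <meta-data
        android:name="android.view.im"
        android:resource="@xml/method" />
</service>
```

The hard part is then the service itself: drawing the stroke canvas, running recognition, and committing text through the InputConnection.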
What about myscript stylus beta?
{Diemex} said:
What about myscript stylus beta?
Yep, it gives the handwriting recognition option in any input field and covers all the S Pen keyboard points I was after.
Thank you!
{Diemex} said:
What about myscript stylus beta?
Another "beef" I have with the S Pen at the moment is screenshots.
Taking screenshot segments with an S Pen-able device is a frustrating experience on a TouchWiz-free ROM like CM12 at the moment.
It's no fault of the ROM builds themselves.
In fact, it is truly remarkable how stable and fast CM12 (Lollipop) runs.
RaymanFX's effort is paying dividends to P600 owners (Samsung Note 10.1 2014 Wi-Fi)!
I wasn't able to assemble a draft paper/document with images while browsing thumbnails and articles without leaving the browser screen...
There is no way to select multiple areas of the screen and have them imported/clipboard-copied as individual selections (check the mockup imagery).
OK, we have no S Pen pressure detection at the moment and the S Pen is just a pointing device to a TouchWiz-free ROM; fine. All of that should still be sufficient to achieve S Pen screenshot integration like I outlined in the mockup images.
Please excuse my ignorance and share everything that can be done to improve the S Pen screenshot situation!!!
I started trying to get a simple screencapper to work, similar to the one included in TouchWiz ROMs.
What is working:
Taking screenshots - root required
Interception of taps
Selection of the crop area above other apps
Cropping of the screenshots to the rectangular selection
What still needs to be fixed/improved:
At the moment, when you start the screencapper, it will intercept all touch events. You basically can't use any other apps and have to kill it somehow. I still have to figure out how to toggle interception on/off: pressing the S Pen button while it is touching the screen should toggle the interception needed for selecting the screenshot area, and releasing the button should return to a state where other apps can be used again.
Pressure and the button state seem to be working correctly. The mouse pointer is just visual; stylus functionality is working correctly.
I have very little time, I have to study for exams. Nevertheless I'm interested in getting a functional app at some stage.
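The toggle described above (intercept only while the S Pen button is held, hand input back on release) can be modeled as a small state machine separate from the overlay plumbing. This is only a sketch under my own assumptions; the class and method names are mine, not from the actual app:

```java
// Minimal model of the interception toggle: touch events are consumed
// (and grow the crop rectangle) only while the S Pen button is held,
// so releasing the button always hands input back to other apps.
public class CaptureInterceptor {
    private boolean intercepting = false;
    private int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
    private int maxX = Integer.MIN_VALUE, maxY = Integer.MIN_VALUE;

    /** Call on every pen-button state change. */
    public void onPenButton(boolean pressed) {
        intercepting = pressed;
    }

    /** Returns true if the event was consumed for the crop selection. */
    public boolean onTouch(int x, int y) {
        if (!intercepting) return false; // pass through to other apps
        minX = Math.min(minX, x); minY = Math.min(minY, y);
        maxX = Math.max(maxX, x); maxY = Math.max(maxY, y);
        return true;
    }

    /** {x, y, width, height} of the selected crop area, or null if none. */
    public int[] selection() {
        if (maxX < minX) return null;
        return new int[] { minX, minY, maxX - minX, maxY - minY };
    }
}
```

In the real app the onTouch part would sit in an overlay view's touch handler, and the crop rectangle would then be applied to the captured screenshot bitmap.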
{Diemex} said:
I started trying to get a simple screencapper to work, similar to the one included in TouchWiz ROMs.
What is working:
Taking screenshots - root required
Interception of taps
Selection of the crop area above other apps
Cropping of the screenshots to the rectangular selection
What still needs to be fixed/improved:
At the moment, when you start the screencapper, it will intercept all touch events. You basically can't use any other apps and have to kill it somehow. I still have to figure out how to toggle interception on/off: pressing the S Pen button while it is touching the screen should toggle the interception needed for selecting the screenshot area, and releasing the button should return to a state where other apps can be used again.
Pressure and the button state seem to be working correctly. The mouse pointer is just visual; stylus functionality is working correctly.
I have very little time, I have to study for exams. Nevertheless I'm interested in getting a functional app at some stage.
Click to expand...
Click to collapse
What app are you using to achieve that?
There is a Mac app called Screencapper, but nothing on Android's Google Play with that name.
I tried a dozen-plus screenshot apps with varying degrees of success. None of them gets me close to a TouchWiz-ROM-like screen-capture setup.
It's interesting that the S Pen button and stroke pressure work for you.
A few posters in the thread have suggested that S Pen functionality like stroke pressure and the button is not working on Lollipop-based ROMs at the moment.
Are you using a KitKat-based ROM with the CM S Pen app and the S Pen SDK for that functionality?
I wrote the app myself. There are some screenshot apps, but I couldn't find one designed for the S Pen. Maybe there is one that allows cropping the screenshot afterwards.
I tested stylus input on CM12; you can try it yourself. The link is in my signature.
xda_nikita said:
What app are you using to achieve that? [...]
Oh, sorry if I didn't explain myself correctly. First of all, pressure, hover and button click work correctly on CyanogenMod: that information is correctly read by the kernel and passed to the system. If an app can work with that information through the Samsung or other libraries, it will work as intended, just like Papyrus, Write, Wacom, etc.
What does not work, or to put it correctly, is not implemented yet, is the broadcasting of some information that the kernel reads correctly but that should be sent out to the system as broadcasts. That information covers the gestures we all know, like pen attachment/detachment, pen button + hover, and pen double tap, and those broadcasts are what would allow a developer to imitate some of TouchWiz's main features.
A very silly example of what broadcasts like these would enable:
"Hey, the user just double-tapped the screen with his S Pen. Is there any app that wants to open itself when this happens? Perhaps you, SuperDuperNote app? Maybe you, TotallyLegitActionMemo app?"
So there are no broadcasts, and we developers have no easy way to react to these gestures system-wide, anywhere on the screen, in whatever app is open at the moment, the way Samsung does with Action Memo on "button + double tap" or Action Menu on "button + hover".
With this in mind, a developer would have to do one of three things:
1. Give up and build apps that only react to S Pen information (pressure, hover, button click) through the Samsung or other SDKs (Wacom, Papyrus, Write, SPaste).
2. Intercept the S Pen events with root and be a very skillful programmer, to do something like our friend @{Diemex} is doing.
3. Install the CM S Pen companion app and react to its broadcasts like we should be able to, but be dependent on that app's free functions, or on whether the user wants to buy the pro version.
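The missing mechanism described above is essentially Android's broadcast model: apps register a receiver for a gesture action string, and the system notifies every registered receiver when the gesture fires. A framework-free sketch of the idea, with made-up action names, looks like this:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Simplified, framework-free model of hypothetical S Pen gesture broadcasts.
// Action strings here are illustrative, not real CM or Samsung intents.
public class GestureBroadcasts {
    private final Map<String, List<Consumer<String>>> receivers = new HashMap<>();

    // An app registers interest in one gesture action.
    public void register(String action, Consumer<String> receiver) {
        receivers.computeIfAbsent(action, k -> new ArrayList<>()).add(receiver);
    }

    // The "kernel saw a gesture" side: notify everyone who asked for it.
    // Returns how many receivers reacted.
    public int send(String action) {
        List<Consumer<String>> list =
                receivers.getOrDefault(action, new ArrayList<>());
        for (Consumer<String> r : list) r.accept(action);
        return list.size();
    }
}
```

In real Android this would be `Context.sendBroadcast()` on the system side and a `BroadcastReceiver` with an intent filter on the app side; the sketch only shows the registration/dispatch shape.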
Thank you, Pazzu510.
You have covered all the angles. The explanation is crystal clear.
I've sent an email to the CM SPen Addon developer to see what his frame of mind is like regarding CM12.
His app had attach/detach covered in CM11, so maybe he has a good angle on getting the functionality registered in CM12.
xda_nikita said:
I've sent an email to the CM SPen Addon developer to see what his frame of mind is like regarding CM12. [...]
He has explained that it would be rather easy if this broadcast code were implemented directly into the ROM. You can see the post here; it is a good read.
With that understood, and to have a real impact on how broad the apps based on this can be, the important thing would be to make these changes and commit them to the CM code. That way, any device with an S Pen could install CyanogenMod (or any ROM based on CyanogenMod) and access apps that use these broadcasts. Otherwise, our apps would be limited to devices running only the ROMs that make the code available, be it "CM12 Rayman edition" or the like. I don't see that as a great idea, because the apps made that way would have little to no compatibility.
What I see as a viable option, at least until those commits are in the CyanogenMod code, is to analyze the code behind the CM S Pen add-on (it's on GitHub, if I recall correctly) and see how it watches for the kernel event changes. This would work on any device with an S Pen, and asking for root on CM is not difficult at all. Any app could start a tracking service for these events and act accordingly. My only worry is that if you have, say, 4 or 5 apps each running its own service, it would probably drain battery life. Another option would be one big project with all the functions in a single app, but big projects need many people to make them work, and not many have the knowledge, the time or the passion for such a thing; even less when you consider that the target market for this project is not big. Sadly, I count myself as a passionate amateur; I don't have the knowledge to do such things.
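The root-based tracking approach can be sketched by parsing `getevent` output, which prints one line per kernel input event as `<device>: <type> <code> <value>`, all in hex. `EV_KEY` (0x01) and `BTN_STYLUS` (0x14b) are standard Linux input-event constants; the class name, device path and return convention below are made up for illustration.

```java
// Hypothetical sketch: watch `getevent` lines for the S Pen side button.
public class StylusEventParser {
    static final int EV_KEY = 0x01;      // Linux input event type for keys/buttons
    static final int BTN_STYLUS = 0x14b; // stylus side-button key code

    // Returns +1 for button down, -1 for button up, 0 for anything else.
    public static int parse(String line) {
        String[] parts = line.trim().split("\\s+");
        if (parts.length != 4) return 0;
        int type = Integer.parseInt(parts[1], 16);
        int code = Integer.parseInt(parts[2], 16);
        long value = Long.parseLong(parts[3], 16);
        if (type == EV_KEY && code == BTN_STYLUS) {
            return value != 0 ? 1 : -1;
        }
        return 0;
    }
}
```

A tracking service would run `su -c getevent` in a subprocess, feed each line through a parser like this, and fire its actions on the down/up transitions.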
CM12 (RaymanFX), build 12/15/2014, has now been tested with the CM SPen add-on and Notebuddy.
S Pen attach and detach triggers are detected and broadcast with the latest build!
What are the best apps to use with Notebuddy?
xda_nikita said:
Developers help, enlightenment and info needed to improve Spen integration in CM 12 dramatically!
I recently bought this device purely for the S Pen functionality, and I hate TouchWiz with a passion. I'm a design student and will be using Sketchbook (the paid version) the majority of the time. All the applications mentioned are focused on note taking, whereas I'm primarily concerned with correct pressure sensitivity and stroke accuracy.
I'll flash the ROM when I get my hands on one; might donate if all goes well!
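For what it's worth, "correct pressure sensitivity" on the app side usually comes down to how the normalized pressure the digitizer reports (0.0 to 1.0, via `MotionEvent.getPressure()` on Android) is mapped to a stroke width. A minimal sketch of such a mapping, with made-up names and parameters:

```java
// Illustrative pressure-to-stroke-width curve, not from any specific app.
public class PressureCurve {
    // gamma < 1.0 makes light touches count more; gamma > 1.0 requires
    // pressing harder for the same width. All names here are hypothetical.
    public static double strokeWidth(double pressure, double minW,
                                     double maxW, double gamma) {
        double p = Math.max(0.0, Math.min(1.0, pressure)); // clamp to [0, 1]
        return minW + Math.pow(p, gamma) * (maxW - minW);
    }
}
```

Drawing apps differ mainly in the shape of this curve (and in how they smooth successive pressure samples), which is why the same stylus hardware can feel very different from one app to the next.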