Touchscreen calibration and some other (related?) things - ZenFone 2 General

Backstory (because context is always nice)
Before I got my ZenFone 2, I was using a Samsung Galaxy S2 as my main phone.
It was a really nice phone and became even better after some small modifications, such as attaching a Bluetooth keyboard and converting the touchscreen into a touchpad (like the ones on laptops). With a chroot Linux installation it actually felt like a pocket computer. However, it had rather low processing power, very little RAM, and a small, low-resolution screen. The ZF2, with its good specs and x86_64 CPU, would be much better suited to being truly a PC in the pocket.
So I wanted to make the touchscreen on the zf2 behave like a touchpad.
According to this, I only need to change 1-2 lines in an .idc file.
This worked well on the SGS2 (filename: sec_touchscreen.idc), but on the ZF2 there is no such file.
After looking through a bit of kernel code, I found out that the file should be called "ftxxxx_ts.idc".
After creating the file you can start to calibrate the behaviour of the touchscreen.
This might be helpful: https://source.android.com/devices/input/touch-devices.html
touchscreen --> touchpad
Here are the basic steps of making the touchscreen behave like a touchpad.
Simply run these commands in a root shell
Code:
echo "touch.deviceType = pointer" >> /system/usr/idc/ftxxxx_ts.idc
echo "touch.gestureMode = spots" >> /system/usr/idc/ftxxxx_ts.idc
echo "touch.orientationAware = 1 " >> /system/usr/idc/ftxxxx_ts.idc
chmod 544 /system/usr/idc/ftxxxx_ts.idc
chown 0:0 /system/usr/idc/ftxxxx_ts.idc
or add these three lines manually to the file (creating it first if it doesn't exist).
Code:
touch.deviceType = pointer
touch.gestureMode = spots
touch.orientationAware = 1
Make sure the permissions are correct (644, i.e. rw-r--r--).
ALSO IMPORTANT: Make sure that there are no mistakes in the code before running it (because I might have mistyped something....but I don't think so).
(Soft-)Reboot for the changes to take effect.
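A soft reboot can be triggered from the same root shell; a minimal sketch, assuming your ROM supports the standard ctl.restart init property:
Code:
su -c 'setprop ctl.restart zygote'   # restarts the Android runtime (soft reboot)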
touchscreen <-- touchpad
To revert the changes, you can delete the file or change
Code:
touch.deviceType = pointer
to
Code:
touch.deviceType = touchscreen
You can also comment out the line.
Code:
#touch.deviceType = pointer
This should make it behave like a touchscreen again.
What next:
I still need to find a way to switch between touchscreen and touchpad mode on the fly, without editing and rebooting.
Maybe an xposed module could do this, but I have not found one and I have no idea how to write one.
Therefore: if anyone knows how to do this, please help.
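In the meantime, a crude way to toggle without hand-editing is a small root script that rewrites the idc file and soft-reboots. A minimal sketch, assuming a busybox-style sed is available and the file already contains a touch.deviceType line:
Code:
#!/system/bin/sh
# hypothetical toggle script: flip between touchpad and touchscreen mode
IDC=/system/usr/idc/ftxxxx_ts.idc
mount -o remount,rw /system
if grep -q 'deviceType = pointer' "$IDC"; then
  sed -i 's/deviceType = pointer/deviceType = touchscreen/' "$IDC"
else
  sed -i 's/deviceType = touchscreen/deviceType = pointer/' "$IDC"
fi
mount -o remount,ro /system
setprop ctl.restart zygote   # soft reboot so the new idc is picked up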

Related

[proof of concept app] Gesture recognition

I recently saw this thread:
http://forum.xda-developers.com/showthread.php?t=370632
I liked the idea, and when I thought about it, gesture recognition didn't seem too hard. And guess what - it really wasn't hard.
I made a simple application recognizing gestures defined in an external configuration file. It was supposed to be a gesture launcher, but I haven't found out how to launch an app from a WinCE program yet. Also, it turned out to be a bit too slow for that because of the way I draw gesture trails - I'd have to rewrite it almost from scratch to make it really useful, and I don't have time for that now.
So I decided to share the idea and the source code, just to demonstrate how easy it is to include gesture recognition in your software.
My demo app is written in C, using XFlib for graphics, and compiled with CeGCC, so you'll need both of them to compile it (download and install instructions are on the XFlib homepage: www.xflib.net).
The demo program is just an exe file - extract it anywhere on your device, no installation needed. You'll also need to extract gestureConfig.ini to the root directory of your device, or the program won't run.
Try some of the gestures defined in the ini (like the 'M' letter - 8392, a rectangle - 6248, a triangle - 934), or define some of your own to see how the recognition works. Make sure each line is a string of digits, then a space or a tab (or several), and then some text - anything will do, just make sure that there's more than just the numbers in each line. Below that you can set the side sensitivity to tweak recognition (see the rest of the post for a description of how it works). Better leave the other parameter as it is - it seems to work best with this value.
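For illustration, a hypothetical gestureConfig.ini fragment in that format (digits, whitespace, then a label; the gesture codes are the ones named above, the labels are made up):
Code:
8392    M letter
6248    rectangle
934     triangle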
Now, what the demo app does:
It recognizes the direction of drawn strokes and prints them at the bottom of the screen as a string of numbers representing them (described below). If a drawn gesture matches one of the patterns in the config file, the entire drawn gesture gets highlighted. It works best with a stylus, but is usable with a finger as well.
Clicking the large rectangle closes the app.
And how it does it:
The algorithm I used is capable of recognizing strokes drawn in eight directions - horizontally, vertically and diagonally. Directions are described with numbers from 1 to 9, arranged like on a PC numeric keypad:
Code:
7 8 9
4   6
1 2 3
So a gesture defined in the config as 6248 is right-down-left-up - a rectangle.
All that is needed to do the gesture recognition is the last few positions of the stylus. In my program I recorded the entire path for drawing it, but used only the last 5 positions. The entire trick is to determine which way the stylus is moving, and if it moves one way long enough, store this direction as a stroke.
The easiest way would be to subtract the previous stylus position from the current one, like:
Code:
vectorX=stylusX[i]-stylusX[i-1];
vectorY=stylusY[i]-stylusY[i-1];
But this method would be highly inaccurate due to noise generated by some digitizers, especially with screen protectors, or when using a finger (try drawing a straight line with your finger in some drawing program).
That's why I decided to calculate an average vector instead:
Code:
averageVectorX=((stylusHistoryX[n]-stylusHistoryX[n-5])+
(stylusHistoryX[n-1]-stylusHistoryX[n-5])+
(stylusHistoryX[n-2]-stylusHistoryX[n-5])+
(stylusHistoryX[n-3]-stylusHistoryX[n-5])+
(stylusHistoryX[n-4]-stylusHistoryX[n-5]))/5;
//Y coordinate is calculated the same way
where stylusHistoryX[n] is the current X position of stylus, and stylusHistoryX[n-1] is the previous position, etc.
Such averaging filters out the noise, without sacrificing too much responsiveness, and uses only a small number of samples. It also has another useful effect - when the stylus changes movement direction, the vector gets shorter.
Now that we have the direction of motion, we have to check how fast the stylus is moving (how long the vector is):
Code:
if(sqrt(averageVectorX*averageVectorX+averageVectorY*averageVectorY)>25)
(...)
If the vector is long enough, we have to determine which direction it's facing. Since horizontal and vertical lines are usually easier to draw than diagonal ones, it's nice to be able to adjust the angle at which a line is considered diagonal rather than horizontal or vertical. I used the sideSensitivity parameter for that (it can be set in the ini file; its range is from 0 to 100). See the attached image to see how it works.
The green area on the images is the angle where the vector is considered horizontal or vertical. Blue means angles where the vector is considered diagonal. The sideSensitivity values for those pictures are: left - 10, middle - 42 (the default value, works fine for me), right - 90. Using 0 would make horizontal and vertical strokes almost impossible to draw, and 100 would do the same for diagonal ones.
To make this parameter useful, some calculation is needed:
Code:
sideSensitivity=tan((sideSensitivity*45/100)*M_PI/180);
First, the range of the parameter is changed from (0-100) to (0-22), i.e. the angle in degrees of the line dividing the right section (green) from the top-right one (blue). That angle is then converted to radians, and the tangent of this angle (in radians) is calculated, giving the slope of this line. For example, the default value of 42 maps to 42*45/100 = 18.9 degrees, giving a slope of about 0.34.
Having the slope, it's easy to check whether the vector is turned sideways or diagonal. Here's the part of the source code that does the check; it is executed only if the vector is long enough (the condition written above):
Code:
if( abs(averageVectorY)<sideSensitivity*abs(averageVectorX) ||
abs(averageVectorX)<sideSensitivity*abs(averageVectorY)) //Vector is turned sideways (horizontal or vertical)
{
/*Now that we know that it's facing sideways, we'll check which side it's actually facing*/
if( abs(averageVectorY)<sideSensitivity*averageVectorX) //Right Gesture
gestureStroke='6'; //storing the direction of vector for later processing
if( abs(averageVectorY)<sideSensitivity*(-averageVectorX)) //Left Gesture
gestureStroke='4';
if( abs(averageVectorX)<sideSensitivity*(averageVectorY)) //Down gesture
gestureStroke='2';
if( abs(averageVectorX)<sideSensitivity*(-averageVectorY)) //Up gesture
gestureStroke='8';
}
else
{ //Vector is diagonal
/*If the vector is not facing sideways, then it's diagonal. Checking which way it's actually facing
and storing it for later use*/
if(averageVectorX>0 && averageVectorY>0) //Down-Right gesture
gestureStroke='3';
if(averageVectorX>0 && averageVectorY<0) //Up-Right gesture
gestureStroke='9';
if(averageVectorX<0 && averageVectorY>0) //Down-Left gesture
gestureStroke='1';
if(averageVectorX<0 && averageVectorY<0) //Up-Left gesture
gestureStroke='7';
}
Now we have a character telling which way the stylus is moving (I used the char type so I can use character strings for gestures - they can be easily loaded from a file and compared with strcmp()). To avoid errors, we have to make sure that the stylus moves in the same direction for a few cycles before storing it as a gesture stroke: increase a counter as long as it keeps moving in one direction, and reset it if the direction changes. If the counter value is bigger than some threshold (the pathSensitivity variable is used as this threshold in my program), we can store the gestureStroke value into a string, but only if it's different from the previous one - who needs a gesture like "44444" when dragging the stylus left?
After the stylus is released, you'll have to compare the generated gesture string to some patterns (e.g. loaded from a configuration file), and if it matches, perform an appropriate action.
See the source if you want to see how it can be done - this post is already quite long.
If you have any questions, post them and i'll do my best to answer.
Feel free to use this method, parts of it, or the entire source in your apps. I'm really looking forward to seeing some gesture-enabled programs.
Very nice work. Reading your post was very insightful, and I hope this can provide the basis for some new and exciting apps!
Great app... and well done for not just thinking "that seems easy" but actually doing it...
I've been a victim of that myself.
Very nice work, man. One question: which tool did you write the code in? I mean, it looks like C, but how did you test it and all?
Great app. I see that it's just a proof of concept at this stage, but it could be used for application control in the future.
Continue with your great work.
nik_for_you said:
Very nice work, man. One question: which tool did you write the code in? I mean, it looks like C, but how did you test it and all?
It is C, (no "++", no "#", no ".NET", just god old C ) compiled with opensource compiler CeGCC (works under linux, or under windows using cygwin - a unix emulator), developed in opensource IDE Vham (but even a notepad, or better notepad++ would do), tested directly on my Wizard (without emulator). I used XFlib which simplifies graphics and input handling to a level where anyone who ever programed anything at all should be able to handle it - it providea an additional layer between the programmer and the OS. You talk to Xflib, and Xflib talks to the OS. I decided to use this library, because i wanted to try it out anyway.
If i decide to rewrite it and make an actual launcher or anything else out of it, i'll have to use something with a bit faster and more direct screen access (probably SDL, since i already done some programing for desktop PC with it) - XFlib concentrates on usage of sprites - like 2D console games. every single "blob" of the gesture trail is a separate sprite , which has to be drawn each time the screen is refreshed - that is what slows down the app so much. The gesture recognition itself is really fast.
Very good program. I just tested it and it works very well. Some combinations are pretty hard to pull off, but I like the blue points turning red with command2 and 934. Good luck - I'll keep following your work; maybe you'll code a very interesting piece of software.
Interesting work.... would like to see this implemented in an app, could be very useful.
If you want I have some code I did for NDS coding, and which I ported on PocketPC for XFlib.
It works perfectly well and I use it in Skinz Sudoku to recognize the drawn numbers.
The method is pretty simple: when the stylus is pressed, enter the stylus coordinates in a big array. And when it's released, it takes 16 points (could be changed depending on what you need) at the same distance from each other, checks the angle, and gives you the corresponding 'char'.
To add new shapes, it's just a 15-character string which you link to any char (like, link a right movement to 'r', or 'a', or a number, or whatever ^^). It works for pretty much any simple shape, and I even used it to do a graffiti-like thing on the NDS which worked really well.
Hey!
How do you get the last stylus positions? And how often do you read them out?
I want to implement such code in VB.NET, but I don't know how I should read out the last stylus positions to get them right for such calculations.
Code:
Private Sub frmGesture_MouseMove(ByVal sender As Object, ByVal e As System.Windows.Forms.MouseEventArgs) Handles MyBase.MouseMove
If StylusJump = 1 Then
StylusJump += 1
If (CurrentStylusPosition.X <> frmGesture.MousePosition.X) Or (CurrentStylusPosition.Y <> frmGesture.MousePosition.Y) Then
Dim i As Integer
For i = 9 To 2 Step -1
LastStylusPosition(i).X = LastStylusPosition(i - 1).X
LastStylusPosition(i).Y = LastStylusPosition(i - 1).Y
Next
LastStylusPosition(1).X = CurrentStylusPosition.X
LastStylusPosition(1).Y = CurrentStylusPosition.Y
CurrentStylusPosition.X = frmGesture.MousePosition.X
CurrentStylusPosition.Y = frmGesture.MousePosition.Y
End If
Dim LabelString As String
Dim iCount As Integer
LabelString = "C(" & CurrentStylusPosition.X & "\" & CurrentStylusPosition.Y & ")"
For iCount = 1 To 9
LabelString = LabelString & " " & iCount & "(" & LastStylusPosition(iCount).X & "\" & LastStylusPosition(iCount).Y & ")"
Next
lblGesture.Text = LabelString
ElseIf StylusJump <= 3 Then
StylusJump += 1
Else
StylusJump = 1
End If
End Sub
Sorry, I didn't notice your post before. I guess you have the problem solved now that you've released a beta of Gesture Launcher?
Anyway, you don't really need the 10 last positions; in my code I used only 5 for the calculations and it still worked fine.
Nice thread, thanks for sharing.
Human-machine interface has always been an interesting subject to me, and the release of ultimatelaunch has sparked an idea. I am trying to achieve a certain look-and-feel interface - entirely using components that are today screen and ultimatelaunch compatible. Basically a clock strip with a few status buttons at the top, and an ultimatelaunch cube for the main lower portion of the screen - gesture left/right to spin the cube, and each face should have lists of info / icons scrolled by vertical gesture. I'm talking big chunky buttons here - tasks, calendar appts, (quick) contacts, music/video playlists - all vertical lists, one item per row, scrolling in various faces of the cube.
Done the top bit using rlToday for now. Set it to type 5 so scrollbars never show for the top section. All good. Cobbling together bits for the faces, but few of the apps are exactly what I want, some (like that new face contacts one) are pretty close, and being a bit of an armchair coder, I thought now would be a good opportunity to check out WM programming and see if I can't at least come up with a mockup of what I want if not a working app.
I was wondering if anyone could advise me on whether I should bother recognising gestures in such a way as this. Does WM not provide gesture detection for the basic up, down, left, right? Actually, all the stuff I have in mind would just require up/down scrolling of a pane - I was thinking that I may well not need to code gesture support at all: just draw a vertical stack of items, let it overflow to create a scrollbar, and then use the normal WM drag-to-scroll feature (if it exists) to handle the vertical scrolling by gesture in the face of the cube. I would rather keep the requirements to a minimum (e.g. TouchFLO), both for dependency and compatibility reasons, so maybe doing the detection manually would be the "way to go", I dunno.
Did you release source with the app launcher? A library maybe? I intend to go open source with anything I do, so if you are doing the same then I would love to have access to your working code
Nice work man.
Impressive.

[Q] GLSurfaceView and Soft Input

Hi,
I'm writing an open gl game and it requires the user to be able to type input for things such as high scores, saved games and whatnot.
The problem I'm having is that I can't get the soft input keyboard to display with the GLSurfaceView focused for input events.
Is there a trick to it? Is it even possible? Or do I have to draw my own keyboard with OpenGL?
I definitely do not want to show a separate activity with Android controls, because that would look really cheap and subtract from the game experience.
Any help would be appreciated,
Thank you.
Figured it out.
In the GLSurfaceView constructor I needed to set:
Code:
this.setFocusable(true);
this.setFocusableInTouchMode(true);
And then to show the keyboard:
Code:
((InputMethodManager) context.getSystemService(Context.INPUT_METHOD_SERVICE)).showSoftInput(view, 0);
And to hide:
Code:
((InputMethodManager) context.getSystemService(Context.INPUT_METHOD_SERVICE)).hideSoftInputFromWindow(view.getWindowToken(), 0);
Not difficult, but it just isn't documented anywhere, at least not the setFocusable() stuff.

sendevent for input tap x y?

I'm struggling to build up a Tasker routine to automate the USB Audio setup with one screen tap.
To do that I need a method for emulating screen taps (like for "off" and "host" and "close" in the USB Mode Utility, for example). I've already discovered that Eclair lacks the shell command "input tap x y". That would have been the easy route.
I'm not sure about "sendevent" and I'm even less sure about the syntax. Every resource I look at seems written for people who already know the answer. The typical syntax seems to be something like
sendevent /dev/input/event2 x x x
I can see in the root directory that there is a /dev folder and inside it an input folder. Five things are listed in there:
21:14 event0
21:14 event1
21:14 event2
21:14 mice
21:14 mouse0
Does anyone know if "sendevent" is a valid shell command in Eclair and, if so, the functions of the events listed above?
Edit: well, I have a partial answer. Running an adb shell getevent I found the following:
event2 = zForce Touchscreen
event1 = gpio-keys (hardware, I assume?)
event0 = TWL4030 Keypad
It said it could not find a driver version for mice or mouse0 because it was not a typewriter (duh).
So it looks like event2 is what I need to deal with. Now if only I understood how. I know I need the screen coordinates where the touch is to be emulated,
and I have an app for that.
As much as I love UsbMode, you don't need it.
For a script, you are better off just doing what it does yourself.
Code:
echo host > /sys/devices/platform/musb_hdrc/mode
echo peripheral > /sys/devices/platform/musb_hdrc/mode
echo 0 > /sys/devices/platform/bq24073/force_current
echo 500000 > /sys/devices/platform/bq24073/force_current
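Folding those four lines into a one-touch toggle is straightforward; a minimal sketch, assuming the mode file reads back "host"/"peripheral" and that host mode pairs with force_current 0 (as discussed below):
Code:
#!/system/bin/sh
# hypothetical USB audio toggle for the NST
MODE=/sys/devices/platform/musb_hdrc/mode
CURR=/sys/devices/platform/bq24073/force_current
if grep -q host "$MODE"; then
  echo peripheral > "$MODE"   # back to normal mode
  echo 500000 > "$CURR"       # 500 mA
else
  echo host > "$MODE"
  echo 0 > "$CURR"            # charging current off
fi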
Renate NST said:
As much as I love UsbMode, you don't need it.
For a script, you are better off just doing what it does yourself.
Code:
echo host > /sys/devices/platform/musb_hdrc/mode
echo peripheral > /sys/devices/platform/musb_hdrc/mode
echo 0 > /sys/devices/platform/bq24073/force_current
echo 500000 > /sys/devices/platform/bq24073/force_current
Wow, and I was so excited because I figured out how to use sendevent w/Tasker to "press" the OFF button in the USB Mode Utility app today!
I really appreciate your response, Renate, so please bear with my lack of Android understanding. I can see that the first line is equivalent to tapping "host" (at least I hope that's what it is...). The second is how to get back to normal mode.
By extension I am guessing that the third line is equivalent to "off" while the last line is equivalent to "auto". Right so far?
Now the most important question: so is "echo" a shell command I can use? I looked it up and it appears to be, just want to check before I try typing that into Tasker (not that it's half as bad as 8 sendevents to "touch" the screen one time!).
And one last question: is there a similar command equivalent to the "beep" of AudioCTRL (i.e., to kickstart the audio)?
Edit: oh wait, this is it, isn't it: kill -9 19409 [that being the PID of mediaserver on my NST]
Thanks for your help!
Woo-Hoo!!
The shell commands from Renate work great in a simple toggle Task. I just need to work on a few wait times and it's a done deal. One-touch USB Audio!
One question: the command "echo 500000 > /sys/devices/platform/bq24073/force_current" leaves the max current setting at 500 mA rather than "Auto". I'm guessing that since there is nothing attached to the USB port anyway when you're all done, this is OK?
@nmyshkin
Values are 0, 100000, 500000, 1500000, auto for off, 100mA, 500mA, 1.5A, auto
Renate NST said:
@nmyshkin
Values are 0, 100000, 500000, 1500000, auto for off, 100mA, 500mA, 1.5A, auto
Perfect! Thanks so much. I've got a little widget on my homescreen now that does the work behind the scenes! Still struggling with a shortcut that I could customize a little.
I looked at the App Creator for Tasker but see that it requires Android 2.3. I wonder if created apps would therefore be for 2.3 or up? If not, I'd install it on my Nook Tablet running CM 10.2, make an app and export it. That would be cool.
Edit: both completed. Tasker widget here, stand-alone apps here.
Digging up an old thread here, but I'm trying to figure out a way to use an 'input tap' type event for my nook touch. I've got everything set up for a digital picture frame that can dynamically load images but the only slideshow viewer that I found to work doesn't start automatically, it loads on a file location menu first and I need to manually start the slideshow with a button press. Is there an 'input tap' equivalent that will work with the nook?
Figured it out. The adb shell command getevent will return a series of events when you touch the screen (make sure it is a simple touch and not multiple points). Use these results (converting your numbers from hex to dec) as the command; in my case the correct sequence was:
sendevent /dev/input/event2 3 0 509
sendevent /dev/input/event2 3 1 58
sendevent /dev/input/event2 1 330 1
sendevent /dev/input/event2 0 0 0
sendevent /dev/input/event2 1 330 0
sendevent /dev/input/event2 0 0 0
Obviously it will be different for you, but the general sequence is: x coordinate, y coordinate, touch down, sync, touch release, sync.
And it works (I'm using a series of Tasker adb shell commands)!
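The same sequence can be wrapped in a small helper so the coordinates become parameters; a sketch, using the event codes discovered above (3/0 = ABS_X, 3/1 = ABS_Y, 1/330 = BTN_TOUCH, 0 0 0 = sync) - the event node and values will differ per device:
Code:
#!/system/bin/sh
# hypothetical tap helper for the zForce touchscreen (event2)
tap() {
  sendevent /dev/input/event2 3 0 "$1"   # X coordinate
  sendevent /dev/input/event2 3 1 "$2"   # Y coordinate
  sendevent /dev/input/event2 1 330 1    # touch down
  sendevent /dev/input/event2 0 0 0      # sync
  sendevent /dev/input/event2 1 330 0    # touch release
  sendevent /dev/input/event2 0 0 0      # sync
}
tap 509 58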
I don't know which viewer you are using (or even anything about them), but I'll bet that it can take a path as data in the actuating Intent.
Then you'd only need something like:
Code:
am start -n com.neatoh.viewer/.Viewer -e Path /MyPhotos
No, these are all hypothetical values.

Possible Fix For Touchscreen Issues/Misses (Updated 08/29)

SEE UPDATE BELOW
After doing some more digging on surfaceflinger, atd, and their related libs, I found some interesting entries in a "strings" analysis of libinputflinger.so. Loads of stuff on touch calibration. I noticed some repeated strings that looked like they're assigned to different properties. You can see this clearly by entering:
Code:
strings /system/lib/libinputflinger.so | grep -iE '(^touch\.|[ ][ ]touch\.)'| sed -e 's/^[ \t]*//' | sort -n | uniq
The terminal returns
Code:
touch.coverage.calibration
touch.coverage.calibration: box
touch.coverage.calibration: none
touch.deviceType
touch.distance.calibration
touch.distance.calibration: none
touch.distance.calibration: scaled
touch.distance.scale
touch.distance.scale: %0.3f
touch.gestureMode
touch.orientation.calibration
touch.orientation.calibration: interpolated
touch.orientation.calibration: none
touch.orientation.calibration: vector
touch.orientationAware
touch.pressure.calibration
touch.pressure.calibration: amplitude
touch.pressure.calibration: none
touch.pressure.calibration: physical
touch.pressure.scale
touch.pressure.scale: %0.3f
touch.size.bias
touch.size.bias: %0.3f
touch.size.calibration
touch.size.calibration: area
touch.size.calibration: box
touch.size.calibration: diameter
touch.size.calibration: geometric
touch.size.calibration: none
touch.size.isSummed
touch.size.isSummed: %s
touch.size.scale
touch.size.scale: %0.3f
touch.wake
I looked up some of the strings on the net, and lo and behold, they're build.prop entries! You can see above which props have different strings to assign to them. The ones with a "%0.3f" take a number value, and the one with "%s" is a boolean 0/1.
I've only done a little testing, but I found a baseline of improvement values to make our touch screens more responsive. Some of the properties I couldn't find info on, so I'm testing some values like touch.distance.scale. I feel like I have definitely noticed improvements though. I'm no longer so pissed off using my phone, and the frequency of misses overall seems significantly lower. It's acceptable now. Here's what I'm using now at the end of my build.prop:
Code:
##### touch ######
touch.deviceType=touchScreen
# (geometric, diameter, box, area)
touch.size.calibration=geometric
touch.size.scale=100
# (amplitude, physical, none)
touch.pressure.calibration=amplitude
touch.pressure.scale=0.1
touch.gestureMode=pointer
# (interpolated, vector, none)
touch.orientation.calibration=interpolated
# (box, none)
touch.coverage.calibration=box
For detailed information on these touch properties, read here (search for the property you're interested in; the page is pretty long). Some are self-explanatory and others we'll just need to test more. Check them out and see if any calibration values make a significant change. Just copy the above code and paste it at the bottom of /system/build.prop with a nice file manager like Solid Explorer. Warning: adding these entries to build.prop will change the default touch properties. You can always revert to stock by removing or commenting out the entries in build.prop. I assume most values are safe, but I can't be sure.
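If you'd rather do it from a root shell, a minimal sketch that keeps a backup first (paths are just examples):
Code:
mount -o remount,rw /system
cp /system/build.prop /sdcard/build.prop.bak   # restorable copy
echo "##### touch ######" >> /system/build.prop
echo "touch.deviceType=touchScreen" >> /system/build.prop
# ...append the remaining lines from the block above the same way
mount -o remount,ro /system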
Also worth noting: I found some additional build.prop values to make Android snappier. The fling/swipe velocity values make a big difference. Not sure what the others correlate to.
Code:
##### touch related #####
view.touch_slop=2
view.scroll_friction=1.5
view.minimum_fling_velocity=25
ro.max.fling_velocity=12000
ro.min.fling_velocity=8000
ro.min_pointer_dur=8
windowsmgr.max_events_per_sec=200
EDIT: For detailed information on these touch properties, read here.
I'm gonna add some "profiles" of touch settings to use down here.
This one is for a Nexus 4 I believe. I'm trying it out now, and it seems pretty good. My goal is to emulate the touch experience I had with that phone.
Code:
##### touch ######
touch.deviceType=touchScreen
touch.orientationAware=1
# (geometric, diameter, box, area)
touch.size.calibration=diameter
touch.size.scale=10
touch.size.bias=0
touch.size.isSummed=0
# (amplitude, physical, none)
touch.pressure.calibration=amplitude
touch.pressure.scale=0.005
touch.gestureMode=pointer
# (interpolated, vector, none)
touch.orientation.calibration=none
UPDATE
Hey guys, so here's an update on what I've found out about the touch screen and its issues. I apologize for my low activity on xda. I've been really busy working on some Linux projects.
First, in order for the touch.* settings to work, they need to be put in an .idc (input device configuration) file with the name of the device. For the G4, that is: /system/usr/idc/touch_dev.idc.
If you have another phone or want to check, you can get the name of your touch screen device with the terminal command:
Code:
for i in /dev/input/event*; do j="$(getevent -i $i | grep -i touch)"; j=${j#*name: }; [[ -z $j ]] || echo ${j//\"/}; done
Before you go try out the .idc file, I want to warn you that certain settings will disable the touch screen. If this happens, you'll need to use adb to delete or move /system/usr/idc/touch_dev.idc somewhere else so that it doesn't get loaded when the phone boots. These are some settings you must NOT change in the .idc file:
Code:
touch.deviceType = touchScreen
touch.coverage.calibration = box
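If a bad idc file does lock you out, a hypothetical recovery sequence over adb (assuming adb root/remount work on your setup) looks like:
Code:
adb root
adb remount
adb shell mv /system/usr/idc/touch_dev.idc /sdcard/touch_dev.idc.bak
adb reboot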
These are the settings I'm currently using:
Code:
touch.deviceType = touchScreen
touch.orientationAware = 1
touch.size.calibration = diameter
touch.size.scale = 1
touch.size.bias = 0
touch.size.isSummed = 0
touch.pressure.calibration = physical
touch.pressure.scale = 0.001
touch.orientation.calibration = none
touch.distance.calibration = none
touch.distance.scale = 0
touch.coverage.calibration = box
touch.gestureMode = spots
MultitouchSettleInterval = 1ms
MultitouchMinDistance = 1px
TapInterval = 1ms
TapSlop = 1px
I'm not sure if the Multitouch* and Tap* settings work or if adding more values from libinputflinger will work. There's little documentation on using settings that don't begin with "touch." You might have to do some experimentation and try other entries in the "strings /system/lib/libinputflinger.so" readout. I would also try using the first settings I posted if these don't seem to help.
Another thing I found out is that this phone performs better with low entropy. You can monitor your current entropy level in the terminal:
Code:
watch "cat /proc/sys/kernel/random/entropy_avail"
It's usually around 2000+ and peaks at 4096 with high activity, which is where I think the lag comes in. I found that lowering it to under a 1000 average cut out the lag spikes I was getting:
Code:
echo 16 > /proc/sys/kernel/random/read_wakeup_threshold
echo 16 > /proc/sys/kernel/random/write_wakeup_threshold
I went ahead and added that to an init.d script. This doesn't have any side effects I've noticed, besides possibly increased battery life, since the "hwrng" process that generates entropy has no work to do. In case you don't have init.d: make sure busybox is installed, run this command in the terminal, and you'll have init.d startup:
Code:
mount -o remount,rw /system; echo "sleep 300 && run-parts /system/etc/init.d" >> /system/etc/init.qcom.post_boot.sh; mount -o remount,ro /system
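The script itself can be as small as this; a sketch, with an arbitrary filename (remember to chmod 755 it):
Code:
#!/system/bin/sh
# hypothetical /system/etc/init.d/99entropy
echo 16 > /proc/sys/kernel/random/read_wakeup_threshold
echo 16 > /proc/sys/kernel/random/write_wakeup_threshold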
One last thing to mention: the touch device has a little section in sysfs under /sys/devices/virtual/input/lge_touch. There's some interesting information you can find there, values you can change, and tests you can run. Any file with a name ending in "test" can be run by opening the file (yes, sysfs files are weird like this). All tests pass for me except "abs_test":
Code:
cat /sys/devices/virtual/input/lge_touch/abs_test
> ========RESULT=======
> Absolute Sensing Short Test : RESULT: Fail
> Absolute Sensing Open Test : RESULT: Fail
I haven't seen other people with or without touch screen issues run this test, so it may or may not be an indicator that something's wrong with the touch screen or its kernel-side drivers. By the way, this doesn't require superuser. You can check this on any device and even use a good text editor like QuickEdit to open the file and generate test results.
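To dump all of the self-tests in one go, a quick sketch (no root needed, per the above):
Code:
for t in /sys/devices/virtual/input/lge_touch/*test; do
  echo "== $t =="
  cat "$t"
done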
At this point, I'm fairly content with the new improvements I've made, but my best bet for a complete fix would be upgraded touch drivers. The "Advanced In-Cell Touch" device this phone uses is pretty new. There's a good chance this technology has drivers that don't have all the bugs worked out. This is something we'll have to wait on. On the other hand, if LGE handed over a bootloader unlock method and some source files, I'd be just fine with that too.
What "issues" is this attempting to fix
kyle1867 said:
What "issues" is this attempting to fix
Probably the horrible touch response many users experience.
Is this something that we can copy and paste into the end of the build prop, or is it replacing stuff that is already there?
Sent from my LG-H811 using XDA Free mobile app
Wow nice job man.
Is it possible to address swipes registering as taps through this, or do you think this will already address it?
Harmtan2 said:
Is this something that we can copy and paste into the end of the build prop, or is it replacing stuff that is already there?
Sent from my LG-H811 using XDA Free mobile app
You'll have to add almost all of them.
Yes, bro, I'm also facing the touch problem on my Intex Aqua Star Power. When I hold my finger on the screen, the screen jitters, and across my usage about 20% of touches are mismatched. At first I was irritated, but now I've gotten used to this touch. [emoji28]
Sent from my Aqua Star Power using Tapatalk
The build.prop edits seem to be making a difference.
Sent From My LG G4
Rydah805 said:
The build.prop edits seem to be making a difference.
Sent From My LG G4
Would you say that double tap to wake is improved as well?
esmenikmatixx said:
Would you say that double tap to wake is improved as well?
Hmm, you know what, it is.
Sent From My LG G4
esmenikmatixx said:
Would you say that double tap to wake is improved as well?
I would say so. I have all of these except the new ones he posted, plus a script from another post, and I do see some improvements, definitely in double tap to wake.
GUGUITOMTG4 said:
You'll have to add almost all of them.
Soooo... can you run through this with me? I'm not a novice, but I'm trying to figure out how to add them. Can't I simply text-edit the build.prop on my phone, or pull/push it from my computer?
This post is the reason why I'm glad we now have root.
Akomack said:
Soooo... can you run through this with me? I'm not a novice, but I'm trying to figure out how to add them. Can't I simply text-edit the build.prop on my phone, or pull/push it from my computer?
Yes, you can manually edit it and/or push/pull it, but sometimes it causes a bootloop when edited as plain text. I would suggest using a build.prop editor app from the Play Store (I use Build Prop Editor by JRummy; it's also built into ROM Toolbox). You will have to copy-paste line by line. I'm gonna try those settings, but in my case my screen sometimes misses when the phone gets hot. I attribute my touchscreen issues to the lag LG injected into the thermal files.
GUGUITOMTG4 said:
Yes, you can manually edit it and/or push/pull it, but sometimes it causes a bootloop when edited as plain text. I would suggest using a build.prop editor app from the Play Store (I use Build Prop Editor by JRummy; it's also built into ROM Toolbox). You will have to copy-paste line by line. I'm gonna try those settings, but in my case my screen sometimes misses when the phone gets hot. I attribute my touchscreen issues to the lag LG injected into the thermal files.
Do you have to be rooted to do that?
Hendrycks said:
Do you have to be rooted to do that?
Yes you do
Sent from my LG-H811 using Tapatalk
Hi,
Yesterday I bought a G4 H815.
I have the following problem: if my phone is on the bed next to me, or lying on a table, the touchscreen response is terrible. If I'm holding it in my hand, there's no problem. If it's charging while lying on the bed, there's no problem either.
I took a few photos with my Optimus Black, since I couldn't take any screenshots of the issue:
This is with my phone lying on the bed:
and here holding it in my hands, producing no problems at all.
What is this? It's bloody annoying and totally unacceptable for a phone of this level. And yes... I would use it without holding it, just placing it on the bed next to me, but you can see how it performs like that...
Is my display faulty, or what?
Thanks man.
It's indeed more responsive. Especially double tap to wake is working well now.
*justintime* said:
Thanks man.
It's indeed more responsive. Especially double tap to wake is working well now.
I didn't feel a difference. Can you post a screenshot of your build.prop? Thanks in advance.
Maybe I'm doing it wrong.
Sent from my LG-H811 using Tapatalk
Just edit it with ES File Explorer,
and use the build.prop in the /system folder, not the other one.

[BASH SCRIPT] Torch fix for Pie GSIs

So I was googling around a bit a few days ago and found out you can control the torch (and other LEDs) from the command line, or from a bash script.
Prerequisites
root (magisk or superSU)
FX file manager or Termux
Text editor
Instructions
Create a file in a directory of your choice with a '.sh' extension
Add the following code to the file:
Code:
su -c 'echo 255 > /sys/class/leds/torch/brightness'
Run the shell script in Termux (cd to the directory and run it) or run it using FX; it will ask for root access if you haven't already granted it
Your torch is now on!
To turn it off create another file with
Code:
su -c 'echo 0 > /sys/class/leds/torch/brightness'
in it
Tested on an Honor 9 and P10+
If you have a dual-tone flash, you may find different values cause either LED to come on; for me, '255' is the yellow flash and '1' is the white flash
For a bit more messing around, cd to the '/sys/class/leds' directory; you will see a few more directories for other LEDs on your device, and controlling them is exactly the same as the torch!
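For example, this quick sketch lists each LED along with its maximum brightness (max_brightness is a standard attribute of the leds sysfs class):
Code:
su -c 'for led in /sys/class/leds/*; do echo "$led  max=$(cat $led/max_brightness)"; done'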
Works for me, thanks a bunch dude!
EDIT: on Honor 7X/Huawei Mate SE
Thank you so much for this, but I found from my own experience, and from comments where this has been shared, that a bash script perhaps isn't the most convenient thing to have to run each time. So I thought of implementing this in Tasker: create a "pseudo toggle" and allocate it to a quick settings tile. The algorithm is:
Code:
If flashlightStatus = TRUE then
Run shell command: su -c 'echo 0 > /sys/class/leds/torch/brightness'
Set variable flashlightStatus to FALSE
Else
Run shell command: su -c 'echo 3 > /sys/class/leds/torch/brightness'
Set variable flashlightStatus to TRUE
End If
I set it up this way because if you've never set the variable before (or if it clears on reboot or something), it won't equal TRUE and the torch will still turn on. Also, when I set the brightness to 255 the torch had a slight yellow tint, so upon further reading: even though 255 should be full brightness, apparently max brightness is 3 and gives a white light (this seems to disable automatically after around 750 ms, however). Hope this will help those who are still without a torch on Treble GSIs.
P.S. I also set up a quick profile: if the flashlight status variable = TRUE, wait 750 ms, check if it's still TRUE, then change it to FALSE and turn off the torch. This resets the variable to compensate for the OS turning the torch off automatically while Tasker still thinks it's on. I'm not sure how necessary this is or whether it's overkill.
beejkitsune said:
Thank you so much for this, but I found from my own experience, and from comments where this has been shared, that a bash script perhaps isn't the most convenient thing to have to run each time. So I thought of implementing this in Tasker: create a "pseudo toggle" and allocate it to a quick settings tile. The algorithm is:
Code:
If flashlightStatus = TRUE then
Run shell command: su -c 'echo 0 > /sys/class/leds/torch/brightness'
Set variable flashlightStatus to FALSE
Else
Run shell command: su -c 'echo 3 > /sys/class/leds/torch/brightness'
Set variable flashlightStatus to TRUE
End If
I set it up this way because if you've never set the variable before (or if it clears on reboot or something), it won't equal TRUE and the torch will still turn on. Also, when I set the brightness to 255 the torch had a slight yellow tint, so upon further reading: even though 255 should be full brightness, apparently max brightness is 3 and gives a white light (this seems to disable automatically after around 750 ms, however). Hope this will help those who are still without a torch on Treble GSIs.
P.S. I also set up a quick profile: if the flashlight status variable = TRUE, wait 750 ms, check if it's still TRUE, then change it to FALSE and turn off the torch. This resets the variable to compensate for the OS turning the torch off automatically while Tasker still thinks it's on. I'm not sure how necessary this is or whether it's overkill.
Mind if I improve this?
Code:
su -c 'if grep -q 1 /sys/class/leds/torch/brightness; then echo 0 > /sys/class/leds/torch/brightness; else echo 1 > /sys/class/leds/torch/brightness; fi'
-- obviously you can change the 1 to whatever you want
That's the code I'm using. It avoids an unnecessary variable, so it's faster and uses fewer resources. Plus, it's pretty much fail-safe, since it reads the current state of the torch to determine whether it's on or off; if something else changes the torch state it still works, whereas if something set your variable externally the torch toggle would be messed up until a reboot.
Plus, I'm assuming that variable is a Tasker thing? Not everyone uses Tasker, so eliminating the variable altogether makes this work with any app that can add custom quick settings tiles.
ambitiousButRubbish said:
Mind if I improve this?
Code:
su -c 'if grep -q 1 /sys/class/leds/torch/brightness; then echo 0 > /sys/class/leds/torch/brightness; else echo 1 > /sys/class/leds/torch/brightness; fi'
-- obviously you can change the 1 to whatever you want
That's the code I'm using. It avoids an unnecessary variable, so it's faster and uses fewer resources. Plus, it's pretty much fail-safe, since it reads the current state of the torch to determine whether it's on or off; if something else changes the torch state it still works, whereas if something set your variable externally the torch toggle would be messed up until a reboot.
Plus, I'm assuming that variable is a Tasker thing? Not everyone uses Tasker, so eliminating the variable altogether makes this work with any app that can add custom quick settings tiles.
Yes, the variable is a Tasker thing, so I'm glad there's a solution that doesn't rely on it. Thanks for the upgrade; I've already switched out my Tasker profile for this. It doesn't seem any quicker or anything, but it's simpler!
