Sorry if I posted this in the wrong place, but here it is. I'm posting this since I didn't see any other info about getting 1080p HDMI output working in webtop.
What you want to do is edit your xorg.conf to add the new mode, then use lxrandr to set it. You'll probably want to use a custom panel such as fbpanel; I'm not sure how well the default dock scales.
Here are the changes you want to make in your xorg.conf:
1. Replace:
Code:
# Option "DisplayMask" "TFTLCD,HDMI"
with:
Code:
Option "DisplayMask" "TFTLCD,HDMI"
(it might just say "TFTLCD", if so replace it with "TFTLCD,HDMI")
2. Replace:
Code:
Option "UseEDIDModes" "TFTLCD"
with:
Code:
Option "UseEDIDModes" "TFTLCD, HDMI"
3. In this section:
Code:
...
Section "Device"
Identifier "Tegra Mirror"
Driver "tegra"
...
Replace:
Code:
Option "DisplayMask" "TFTLCD"
with:
Code:
Option "DisplayMask" "TFTLCD,HDMI"
(this step might not be needed)
4. In this section:
Code:
...
Section "Screen"
Identifier "Screen HDMI"
Device "Tegra HDMI"
...
Replace:
Code:
Modes "1280x720"
with:
Code:
Modes "1920x1080"
and in the same section, replace:
Code:
Virtual 1336 1024
with:
Code:
Virtual 1920 1080
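For reference, after step 4 the relevant part of that Screen section should end up looking something like this (reconstructed from the attached file, so the surrounding lines may differ slightly on your build):
Code:
Section "Screen"
Identifier "Screen HDMI"
Device "Tegra HDMI"
...
Modes "1920x1080"
Virtual 1920 1080
...
EndSection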
After that, run "killall Xorg" to restart the X server, then open lxrandr and set your resolution to 1920x1080.
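In other words, once the file is saved, something like this from a root terminal in webtop should do it (assuming xrandr/lxrandr are actually present in your webtop image):
Code:
killall Xorg
# X restarts and reads the edited xorg.conf; then either
xrandr -s 1920x1080
# or run lxrandr and pick 1920x1080 in the GUI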
(I'm not exactly sure on the steps as I wrote this after doing the mod, so please correct me if I'm wrong on any of them. I have attached the full xorg.conf)
eliplan312 said:
I have attached the full xorg.conf)
I suppose I can just replace the whole file instead of editing it?
Not to be a party pooper, but I'm sure many of us would rather see HDMI mirroring than a higher resolution in webtop, which uses a lot of resources.
I voided my warranty and your mum.
pukemon said:
Not to be a party pooper, but I'm sure many of us would rather see HDMI mirroring than a higher resolution in webtop, which uses a lot of resources.
I voided my warranty and your mum.
That has zero relevance to this topic.
pukemon said:
Not to be a party pooper, but I'm sure many of us would rather see HDMI mirroring than a higher resolution in webtop, which uses a lot of resources.
Based on what I've seen, it doesn't really use much more in the way of resources than usual. The Tegra 2 is pretty capable.
noxlord said:
I suppose I can just replace the whole file instead of editing it?
Yes, you can just replace the entire file.
Gonna try it now... then feedback
In fact, you only need to edit Section "Screen", reboot, and run xrandr -s 1920x1080:
Code:
# add the modes you want
Modes "1280x720 1440x900 1920x1080"
# comment out the Virtual line by putting # in front of it
#Virtual 1920 1080
Where is this xorg.conf located? I can't find it in my ROM.
Hi, I actually posted another thread about how to get the Photon to do 1080p with mirroring and/or webtop, and I was referred to this thread.
So, I did the whole thing of using the edited Atrix xorg.conf file and rebooted. But I don't understand how/where to run this command: xrandr -s 1920x1080. I have typed it (as root) in the /etc/X11 directory, and the response is "xrandr: not found". I have tried this both while attached to a monitor and not.
What is xrandr? Is it possible that xrandr only exists on the Atrix? Yes, I am a newbie.
Also, I am unclear on this below:
Code:
# add the modes you want
Modes "1280x720 1440x900 1920x1080"
# comment out the Virtual line by putting # in front of it
#Virtual 1920 1080
Granted, this is a Photon so it may not work, but it's worth a try. However, it looks like the xorg.conf file is the same for the Atrix and the Photon.
Anyhow, thanks.
I know some people have already discussed this in the noRefresh App thread.
But I have a new idea.
I don't want to create a module and embed it into applications.
Instead, I created a new input device in the kernel.
After that, I set the device's permissions ( /dev/input/event3 ) to allow read/write for others.
Then I wrote an app using JNI to catch the input device and set the refresh mode.
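If you want to poke at the new node from a shell first, something like this works on my setup (event3 is just the number it happens to get here; yours may differ):
Code:
chmod 666 /dev/input/event3   # allow read/write for others
getevent /dev/input/event3    # dump raw events while touching the screen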
The prototype proves that the idea works - I already have it running, but it's not perfect yet and still experimental.
I'm just stuck on some threading problems, as I am not a professional programmer.
This is what I have now:
http://www.youtube.com/watch?v=GkFyvRR6In8
NEW!! : http://www.youtube.com/watch?v=cucG03rg3tg
I know that my method is a bit dirty,
but at least it works XD
I hope someone would like to help.
I'll upload the code soon.
By the way, why can't I change the contents of init.rc?
My changes are removed after a reboot... How do I solve this?
wheilitjohnny said:
By the way, why can't I change the contents of init.rc?
My changes are removed after a reboot... How do I solve this?
You need to modify uRamdisk to change init.rc.
Could you tell me how to unpack the uRamdisk? I use both Windows and Ubuntu; a method for either platform is fine. Thanks!
Try the script from this post http://forum.xda-developers.com/showpost.php?p=24135886&postcount=72
This is simply amazing. Since you already have it working, polishing it shouldn't take too long (I think, I'm still a beginner programmer).
I don't think /dev/input is a good place to park that.
There are any number of things that could show up there and affect order.
Why not put it in its own directory?
Maybe I'm missing something, but with your method do you still expect applications to be modified to read/write event3 to control the display mode?
As it is now, you only need a single function call to switch display modes. Yes, there is a little bit of housework to do before that, but what could be simpler than a single function?
I think no place is more suitable than /dev/input,
as /dev/input is the Linux kernel's input system.
In the driver, I only use the input reporting system, not the I/O system.
Maybe I'll even add an input/event searching function later to solve the ordering problem.
The situation now is that the touch screen originally only provides event2,
but if we need to extract move and finger-up information at a higher level, it wastes some time.
To make the whole algorithm simpler and faster, I added one more event device in the zForce driver
that only outputs the FingerMove and FingerUp states.
Since the reporting starts in the kernel, this hack needs changes to uImage + uRamdisk plus an app. The project is quite big.
My final goal is to make the application remember your choice of mode for the focused application:
AlwaysOn / OnlyWhenDragging / AlwaysOff, and so on.
Of course the click-to-call function will still work.
Isn't that a better and more intuitive workflow? Isn't that what we would expect?
I already have a /dev/input/event3 on my system, and sometimes event4, 5, 6.
That's why I don't think that whatever it is you are doing belongs there.
Is the point to allow coders for user applications to interface with your driver?
Is this supposed to work without modifying user applications?
What information would be going in/out of event3?
Clearly I am missing something here.
It doesn't actually have to be event3; the system will automatically create a new eventX. It's just that in the normal case, without any extra USB devices, a Nook should only have up to event2, so I used event3 as an example here.
We can later make the program automatically search for the needed eventX.
Yes, you're right, we don't need to modify any other applications, and that is exactly why I am creating this!
Does anyone know how to call a full-screen refresh from a service?
Always on?
Looks really great. Since the blinking after scrolling is uncomfortable, is it possible to also have an "always on" mode using your new ideas?
My final goal is to make the application remember your choice of mode for the focused application:
AlwaysOn / OnlyWhenDragging / AlwaysOff, and so on.
Of course the click-to-call function will still work.
Also an override mode, for more flexibility.
But the most basic part needs to be handled well first, as it is not perfect yet.
I still haven't figured out how to call the e-ink driver to refresh the screen =-=
wheilitjohnny said:
I still haven't figured out how to call the e-ink driver to refresh the screen =-=
Have you tried
Code:
echo 1 > /sys/class/graphics/fb0/epd_refresh
?
wheilitjohnny said:
Yes, you're right, we don't need to modify any other applications, and that is exactly why I am creating this!
Are you saying that your idea does not require modifying user applications?
If it doesn't, then there is no need to have a public interface.
It will be only your code talking to your code.
What is the point of this /dev/input/event3? You say that it will be writable. What's going in and out?
Some apps will be using gestures, some dragging. How are you going to keep track of that all?
I have one application that works perfectly fine now, one activity uses swipe gestures to page up/down while another activity uses drag with a user choice of A2 and display while dragging or else only panning at ACTION_UP.
All this requires less than 10 lines of code.
With multitouch, many applications don't even need A2. Even normal panning in Opera Mobile works much better now that Opera doesn't try to display while panning.
Maybe my English is too bad to express the idea well.
I know we can make such an application, with noRefreshDrag working well on its own.
But what about other applications? It is impossible for us to change them all.
So my idea is to make it system-based.
My prototype is working very well now, after several adjustments.
It is not limited to one application; it covers the whole system.
The approach is like this:
1. The zForce driver provides extra information to an input event device.
2. A JNI layer catches the input events.
3. A service gets the data and sets the update mode.
We only need to write one application to handle the settings for this chain.
This is what I mean; I hope you get it now.
mali100 said:
Have you tried
Code:
echo 1 > /sys/class/graphics/fb0/epd_refresh
?
Let me try it now!
Wow, it works great!
wheilitjohnny said:
The zForce driver provides extra information to an input event device.
I guess that this is the part that I don't understand.
What is this extra information?
Renate NST said:
I guess that this is the part that I don't understand.
What is this extra information?
It creates an extra event device; I called it event3 before.
The driver reports only move and finger_up to event3.
It just provides a channel to pass information from the driver to user space.
You may ask why not directly use the existing event device instead of creating another one.
That is because the original one only has touch and position information, and parsing that back into move information takes a bit of work. Since the hardware already provides the move information, why waste it?
Code:
public boolean dispatchTouchEvent(MotionEvent event)
{
    // every touch event delivered to the activity passes through here
    switch(event.getAction())
    {
        case MotionEvent.ACTION_DOWN:
            break;
        case MotionEvent.ACTION_MOVE:
            break;
        case MotionEvent.ACTION_UP:
            break;
    }
    return(true);   // consume the event
}
Isn't that everything that you could ask for?
Hello,
I have to do a task that is more specific than pure Android development. I have to create an Android application with a core module written in C or C++ (it doesn't matter which) to do some video processing. For example, the scenario is as follows:
1. The Android application captures 10sec video clip from the camera of an Android device;
2. Pass the captured file to the core of the android application;
The core should do the following:
3. Waits for the file, gets the file and opens it;
4. Splits the file into frames - for example, if the camera captures 30fps for 10 seconds video - 300 pictures in png or jpg format;
5. Do some calculations - for example it makes a histogram of a single image and stores it somewhere (db or file, doesn't matter);
6. Returns the result to the Android (Java) code, which will render it on the default output - e.g. screen, console, etc.
How can I do this? I have searched for how to split a video file into frames on Linux and found that it can be done with ffmpeg, but I have never dealt with ffmpeg or video/image processing... Could you help me somehow? I don't know what to do or where to start...
Thanks in advance!
Best Regards,
v4o4ev
Possibly OpenCV / JavaCV
I don't know the exact code, but I'm sure you can get histograms and such with it.
Try looking here:
http://code.google.com/p/javacv/
Pvy.
pvyParts said:
Possibly OpenCV / JavaCV
I don't know the exact code, but I'm sure you can get histograms and such with it.
Try looking here:
http://code.google.com/p/javacv/
Pvy.
Thank you! The problem now is how to split the video into frames... I am looking into what ffmpeg can do, but for now I'm just researching... And one thing - I don't need the code, just help finding information about how best to do it. Thank you so much for the reply - I will take a look now.
Thanks!
Best Regards
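For what it's worth, on a desktop Linux box ffmpeg can dump a clip into individual frames with something like the commands below (file names are placeholders; on Android you would need an ARM build of ffmpeg or a wrapper library):
Code:
mkdir frames
# one numbered PNG per frame of the clip
ffmpeg -i clip.mp4 frames/frame_%04d.png
# or force a fixed rate, e.g. 30 frames per second
ffmpeg -i clip.mp4 -vf fps=30 frames/frame_%04d.png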
v4o4ev said:
Thank you! The problem now is how to split the video into frames... I am looking into what ffmpeg can do, but for now I'm just researching... And one thing - I don't need the code, just help finding information about how best to do it. Thank you so much for the reply - I will take a look now.
Thanks!
Best Regards
In Java I use the highgui classes, createframegrabber or something like that. I grab each frame and process it for item/motion detection, then save a movie from it. It works well enough.
Sent from my Galaxy Nexus using xda premium
I was wondering if anyone could point me in the right direction regarding changing the HDMI output resolution of the Nexus 10. Specifically, I need the output resolution to be 1280x1024; I don't mind if the built-in display needs to be disabled or looks distorted, as long as the HDMI output is 1280x1024.
I've considered recompiling the kernel but couldn't see any obvious options within the Mali-T6XX code to adjust the list of supported resolutions. I also thought of using fbset, but with a screen connected /dev/video/ just lists fb0 and no separate framebuffer for HDMI, so I'm guessing any changes via fbset would only be applied to the built-in LCD and not to the HDMI output.
I don't mind hacking the kernel or the Android system, but I'm unsure where the best place to tackle the problem would be, or whether I'm barking up the wrong tree and no amount of hacking would make it possible.
Any pointers would be immensely helpful.
There are several threads where commands/apps/scripts are mentioned that can change the display resolution:
Raw display change commands mentioned here:
Need Tester for possible HDMI resolution workaround
A script that uses the raw commands to toggle resolutions changes ON/OFF:
[How-To] HDMI Fullscreen Toggle Script
A discussion about screen resolution changes with post #7 having a link to an app for changing resolutions:
hdmi resolution changer app?
So far I have not heard of anyone trying a "1280x1024" resolution, but with the information above you can certainly try it yourself.
http://forum.xda-developers.com/showthread.php?p=41961269#post41961269
lKBZl said:
http://forum.xda-developers.com/showthread.php?p=41961269#post41961269
Just beat me to it. Although that new app does not seem to have the poster's requested resolution of "1280x1024".
'am' - now there's a handy command. Apologies for my noobishness; I should have caught some of those links.
Interesting - it looks like display-size never actually changes the resolution of the screen; instead it changes the size of a virtual canvas of virtual pixels in memory, which is then translated and scaled to the real x,y coordinates of whatever resolution the screen is detected at.
This allows display-size and pixel-density to be set arbitrarily high or low, causing a zoom in/out effect, but the real output resolution remains constant at 1080p.
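For reference, the commands in question look roughly like this on a 4.2 shell (the exact syntax may vary by Android version, and as noted they only change the virtual canvas, not the panel's real output):
Code:
am display-size 1280x1024   # set the virtual display size
am display-size reset       # back to the native size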
Unfortunately the screen I'm using doesn't support 1080p, so I'm hoping to find a way to change the real output resolution.
So I have something to offer you, and something to ask in return.
I've built Atmel's mxt-app (https://github.com/atmel-maxtouch/obp-utils), and have attached it to this thread. On the Desire Z, it can be used as such, from a root prompt:
Code:
#chmod 755 mxt-app
#./mxt-app -d 0 -a 4a
You won't be able to grant execute permissions on a fat32 partition (for example, your SD card), so copy it to somewhere on an ext partition first (/data/local/tmp works well).
It's probably of questionable utility to most, but certainly interesting to play around with.
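For anyone who hasn't pushed a binary to the phone before, the whole dance looks roughly like this (paths are just examples):
Code:
adb push mxt-app /data/local/tmp/
adb shell
su
cd /data/local/tmp
chmod 755 mxt-app
./mxt-app -d 0 -a 4a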
Anyway, now my own request: I've installed a replacement digitizer, and the x axis is inverted. My gut says this is a config issue with the screen's firmware, but Atmel have a really weird strategy whereby their driver is completely open source but you need to sign an NDA to get the config specification. So, I would like to grab a working configuration from somewhere and just load the whole thing into mine. This is where you come in! If you are feeling helpful:
Code:
#./mxt-app -d 0 -a 4a
S
output.cfg      (or whatever file name you want here)
Q
#cp output.cfg /sdcard
Then upload output.cfg from your SD card here (or just copy/paste its contents - it looks like ASCII-encoded octal :/). For bonus points, tell me what ROM you are using, as copying configs between different driver versions may not work so nicely.
Cheers!
Jarrad
OK! So I've had a little success and managed to fix the x-axis on my touchscreen being inverted. This goes for anyone using a phone with the mxt224 touchscreen, though the fix isn't perfect - it needs to be reapplied every boot.
TL;DR:
Code:
#./mxt-app -d 0 -a 4a -T9 -W -r9 -n1 01
should do it for the DZ/G2.
To be more exact, you need to flip the third bit (bit 2, with the least significant bit on the right being bit 0) of Register 9 of the T9 config block - you can read this register like this:
Code:
#./mxt-app -d 0 -a 4a -T9 -R -r9 -n1 -f
On my touch screen, this outputs (among other things) a value of
Code:
0000 0101
This equals 05; flipping bit 2 makes the value
Code:
0000 0001 = 01
The number you use when writing needs to be specified as a two-hex-digit byte, so if you are changing 1100 0111 to 1100 0011, you would write C3 as your number. Windows Calculator in programmer mode can help with this. It's also important to write the leading zero - the program will interpret 1 as 10! You need to write 01.
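If you don't have Windows Calculator handy, a shell one-liner can do the bit-clearing and print the two-digit hex value (here clearing bit 2 of C7):
Code:
printf '%02X\n' $(( 0xC7 & ~(1 << 2) ))   # prints C3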
It would still be really useful to have an output config of a stock touchscreen posted here, as detailed in the previous post, as there are a few other things that the aftermarket controller does differently to the factory DZ part (reporting rate and edge detection are the things I've noticed).
Akdor 1154 said:
OK! So I've had a little success and managed to fix the x-axis on my touchscreen being inverted. This goes for anyone using a phone with the mxt224 touchscreen, though the fix isn't perfect - it needs to be reapplied every boot.
[...]
Hi, how did you figure out which bits to flip?
I think I found a list of the different registers somewhere and set about flipping bits in the ones that seemed appropriate, haha. It took a bit of mucking around.
Sent from my XT1053 using XDA Premium 4 mobile app
Akdor 1154 said:
I think I found a list of the different registers somewhere and set about flipping bits in the ones that seemed appropriate, haha. It took a bit of mucking around.
Sent from my XT1053 using XDA Premium 4 mobile app
Can you recover that list somehow? It's badly needed for another crappy Atmel controller.
Thanks,
Vadim
vadimbrk said:
Can you recover that list somehow? It's badly needed for another crappy Atmel controller.
Thanks,
Vadim
From memory, it's in atmel_mxt_ts.c in the Linux kernel.
Akdor 1154 said:
From memory, it's in atmel_mxt_ts.c in the Linux kernel.
Thanks. Have you found a way to make it permanent?
My device runs Windows; in Linux I can make it work, but as soon as I reboot, the ghost touches are back...
Nope, I ended up just dropping a script in init.d. On Windows you could just stick a batch file in start-up.
Sent from my XT1053 using XDA Premium 4 mobile app
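In case it helps anyone else, the init.d script can be as small as this (the mxt-app path is wherever you pushed the binary; the script needs to be executable and your ROM needs init.d support):
Code:
#!/system/bin/sh
# 99touchfix - reapply the inverted-axis fix at every boot
/data/local/tmp/mxt-app -d 0 -a 4a -T9 -W -r9 -n1 01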
Akdor 1154 said:
Nope, I ended up just dropping a script in init.d. On Windows you could just stick a batch file in start-up.
Sent from my XT1053 using XDA Premium 4 mobile app
To run it on Windows it would need to be ported first.
I found a solution to my specific problem: put the controller in bootloader mode, reboot to Windows (touch doesn't work), and disconnect the controller. Upon connecting it back, the ghost touches disappeared.
The same method of sticking the controller in bootloader mode works on another (Android) device with a problem similar to yours. Try it.
vadimbrk said:
To run it on Windows it would need to be ported first.
I found a solution to my specific problem: put the controller in bootloader mode, reboot to Windows (touch doesn't work), and disconnect the controller. Upon connecting it back, the ghost touches disappeared.
The same method of sticking the controller in bootloader mode works on another (Android) device with a problem similar to yours. Try it.
From the command line you're using, it looks like you used an old version of the utility.
Have you tried the --backup parameter? It saves the config to NVRAM, making the change permanent.
Ah yeah, that was a brain fail on the porting; in my defence it was 7am when I replied. I haven't needed to play with it much since this thread was posted, as I upgraded to a Moto X at the start of the year. I guess if I could make the config permanent I could happily sell my DZ though, so thanks for that.
Sent from my XT1053 using XDA Premium 4 mobile app
Hi, I also replaced my digitizer and the x-axis is flipped.
How can I modify the driver?
mxt-app does not start: "error: only position independent executables (PIE) are supported"
So one thing I was able to achieve recently was setting up the Nook as a screen mirror in a very simple fashion, thanks to folks at mobileread! I was wondering if anyone else uses their Nooks as a screen?
Right now, I have two Nooks set up so that each mirrors half of a laptop with the odd 1360x768 resolution. It's been great for reading academic journals, because those PDFs are typically double-columned.
Could you provide some details on how you achieved this? Or any keywords to help find it?
krzynier said:
Could you provide some details on how you achieved this? Or any keywords to help find it?
I would also love an explanation to that picture.
Is the nook getting the images via usb? Or wifi?
Are any special adapters needed?
krzynier said:
Could you provide some details on how you achieved this? Or any keywords to help find it?
This was achieved using a simple Python script written by kranu @ Mobileread. Here's the link. The original script was only made for a single screen, capturing an 800x600 image of your host screen at a 1:1 ratio. If you wanted to and knew Python, you could easily make it capture a 1600x1200 image and scale it down to fit the Nook's 800x600 screen.
shorty66 said:
I would also love an explanation to that picture.
Is the nook getting the images via usb? Or wifi?
Are any special adapters needed?
The Nook is getting screenshots from the host computer via WiFi, so literally no adapters are needed. All you need is a browser on the Nook and Python installed on the host computer. So yes, this method will technically work on ANY e-ink screen with a browser, but ideally you want a browser that can go into "full screen" mode (no bars). There are few e-readers with an 800MHz CPU that can be had for $30 or less, which is why I think the Nook is an ideal choice for this. You can use as many Nooks as you want if you want to make a bigger screen.
There are other ways to achieve this, such as running a VNC viewer, but I opted for a simpler approach because I didn't intend to interact much with the monitor; I simply needed it for reading.
Wow. Very nice.
Just to let people know, another approach is running VNC, but the advantage of the screenshot approach in this thread, of course, is that you can use more than one Nook.
PoisonWolf said:
This was achieved using a simple Python script written by kranu @ Mobileread. The original script was only made for a single screen, capturing an 800x600 image of your host screen at a 1:1 ratio. If you wanted to and knew Python, you could easily make it capture a 1600x1200 image and scale it down to fit the Nook's 800x600 screen.
Just set the capture region to your monitor resolution:
Code:
w,h=(1600,900) #width and height of capture region
and modify
Code:
image=wx.ImageFromBitmap(bitmap)
data=list(image.GetData())
to
Code:
image=wx.ImageFromBitmap(bitmap)
image = image.Scale(600, 800, wx.IMAGE_QUALITY_HIGH)
image = image.Rotate90(True)
data=list(image.GetData())
and then you have the monitor display scaled to the Nook screen size and rotated to landscape mode.
Big kudos to kranu for the idea and the script, and to PoisonWolf for finding it.
krzynier said:
Just set the capture region to your monitor resolution:
[...]
This is pretty awesome. I know little about Python scripting. I think I'm going to use this to make a smaller capture region that gets scaled up (so things appear larger).