Just installed CM10 on my Nexus Q, but I would like to use it with a 720p TV we have in one of the bedrooms. Anyone know how I can change the resolution from the default 1080p?
Joyrex said:
Just installed CM10 on my Nexus Q, but I would like to use it with a 720p TV we have in one of the bedrooms. Anyone know how I can change the resolution from the default 1080p?
My experience is that it will default to whatever maximum resolution it detects when you turn it on with HDMI connected.
Are you getting a cropped 1080p desktop on your 720p TV? One of the HDMI ports on my TV is busted, and that cropping happened to me when I tried to connect my Q to the TV via an HDMI-to-component converter. Plugging it directly into HDMI, with the TV already on when I booted, fixed it for me.
I am having a similar issue; all boxes are checked and it is situation normal, but it cuts off all the edges. I may swap HDMI ports later.
Sorry if this is the wrong area to post this thread, but as a newbie I don't have the right to post in another section...
I am using an LG 1080i (max) LCD TV at home, and when I connect my new OUYA to it, it shows nothing on the screen.
As far as I know, my LG TV also supports 720p, and all my other gaming devices (Xbox 360/PS3) work flawlessly with it.
I learnt from the support area that the OUYA should support 1080p and 720p, and that it will detect and change the setting automatically upon HDMI connection. I tested my OUYA with another 1080p Sony TV and it works just fine, so it's not that my OUYA is damaged.
I've tried the 720p mod too. It shows a 720p signal on the Sony TV, but again no luck on the LG TV.
Can any of the professionals here give me some hints on solving this, so that I can start enjoying my OUYA experience?
There's no such thing as a 1080i LCD. Your TV is faking the 1080i output (or just accepting the input). It probably runs natively at 1360x768 or a similarly odd resolution. I would assume your TV simply doesn't accept a true 720p signal, but I could be wrong. If you have an Xbox 360 or PS3 (or a laptop), you could try connecting one, setting it to 1280x720, and seeing if it works.
brandogg said:
There's no such thing as a 1080i LCD. Your TV is faking the 1080i output (or just accepting the input). It probably runs natively at 1360x768 or a similarly odd resolution. I would assume your TV simply doesn't accept a true 720p signal, but I could be wrong. If you have an Xbox 360 or PS3 (or a laptop), you could try connecting one, setting it to 1280x720, and seeing if it works.
Try to force it to 1080p and hope your TV can cope. If not, try forcing 720p and see what happens. Most TVs that do 1080i will also do 720p. I know mine does not report 1080p as an accepted mode, because it was made before 1080p appeared on consumer electronics.
You force it via build.prop, or via the mod collection thing if you are not comfortable editing build.prop directly.
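For anyone going the build.prop route, the usual workflow over adb looks something like the sketch below. Be aware that the exact property name controlling HDMI output varies by device and ROM; `persist.sys.hdmi.resolution` here is an illustrative placeholder, not a confirmed Nexus Q key, so check your ROM's documentation before writing anything.

```shell
# Sketch of editing build.prop over adb (keep a backup!).
# The property key below is a placeholder; it differs per device/ROM.
adb root && adb remount                      # make /system writable
adb pull /system/build.prop build.prop.bak   # keep a backup copy
cp build.prop.bak build.prop
echo "persist.sys.hdmi.resolution=720p" >> build.prop   # hypothetical key
adb push build.prop /system/build.prop
adb shell chmod 644 /system/build.prop       # restore sane permissions
adb reboot                                   # changes apply after reboot
```

If the device boot-loops after a bad edit, restoring the backup from recovery undoes the change.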
Hi Everyone,
I just got a Nexus Player and connected it to my TV.
Everything is good except the resolution stays at 720p.
My TV does support 1080p, since all my other devices run at 1080p,
and the Nexus Player should also support 1080p.
Does anyone know if there is a way to manually set the resolution on the Nexus Player?
Thanks a lot.
skyshocker said:
Hi Everyone,
I just got a Nexus Player and connected it to my TV.
Everything is good except the resolution stays at 720p.
My TV does support 1080p, since all my other devices run at 1080p,
and the Nexus Player should also support 1080p.
Does anyone know if there is a way to manually set the resolution on the Nexus Player?
Thanks a lot.
The TV screen resolution will still stay at 720p, but the Nexus Player's screen resolution should be 1080p! There is a way to verify this via the Screen Specs app on the Google Play Store https://play.google.com/store/apps/details?id=com.MTMDevelopers.ScreenSpecs ! If you cannot get this app, let me know. Another way to verify it is via the AnTuTu Benchmark app!
AndroidTVTech1 said:
The TV screen resolution will still stay at 720p, but the Nexus Player's screen resolution should be 1080p! There is a way to verify this via the Screen Specs app on the Google Play Store! If you cannot get this app, let me know. Another way to verify it is via the AnTuTu Benchmark app!
Hi, Thanks a lot for your reply.
However, if the TV shows the incoming signal as 720p, that means the Nexus Player is sending a 720p picture to the TV via HDMI; as a result you see a 720p picture, where I expect to see a 1080p picture.
I have no idea why these apps show 1080p in the system, but as long as it doesn't send 1080p to the TV via HDMI, we can't get 1080p quality, right?
If the TV is 1080p but receives a 720p signal, the picture is not pixel-to-pixel and is not sharp.
Is there a way to force the player to send a 1080p signal via HDMI? Would root allow this?
Thanks a lot.
skyshocker said:
Hi, Thanks a lot for your reply.
However, if the TV shows the incoming signal as 720p, that means the Nexus Player is sending a 720p picture to the TV via HDMI; as a result you see a 720p picture, where I expect to see a 1080p picture.
I have no idea why these apps show 1080p in the system, but as long as it doesn't send 1080p to the TV via HDMI, we can't get 1080p quality, right?
If the TV is 1080p but receives a 720p signal, the picture is not pixel-to-pixel and is not sharp.
Is there a way to force the player to send a 1080p signal via HDMI? Would root allow this?
Thanks a lot.
Everything you are saying is correct, but the Nexus Player will not override a TV's built-in 720p screen resolution... Yes, the Nexus Player is sending a 1080p signal to your 720p TV, and that 1080p signal is being downgraded to 720p... In order to get full 1080p screen resolution from your Nexus Player, your TV has to support 1080p.
Now, if your TV supports 1080i/1080p and you have cable service, there is a way to go into the cable box and change its output resolution. Otherwise, it would have to be changed via the TV settings.
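One way to settle what the player itself is actually outputting, rather than relying on benchmark apps, is `adb shell dumpsys display`, which prints the active display mode. A minimal sketch of pulling the resolution out of that output; the sample line below is made up for illustration, not captured from a real Nexus Player:

```python
import re

# Illustrative excerpt of `adb shell dumpsys display` output; this sample
# string is invented for the example, not taken from a real device.
sample = 'mBaseDisplayInfo=DisplayInfo{"Built-in Screen", real 1280 x 720, 60.0 fps}'

# The "real W x H" pair is the resolution the device is actually driving.
match = re.search(r"real (\d+) x (\d+)", sample)
width, height = (int(match.group(1)), int(match.group(2))) if match else (0, 0)
print(f"{width}x{height}")  # prints 1280x720, i.e. the device is outputting 720p
```

If `dumpsys` reports 1920x1080 while the TV's info panel reports a 720p signal, the mismatch is in the HDMI link rather than the renderer.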
AndroidTVTech1 said:
Everything you are saying is correct, but the Nexus Player will not override a TV's built-in 720p screen resolution... Yes, the Nexus Player is sending a 1080p signal to your 720p TV, and that 1080p signal is being downgraded to 720p... In order to get full 1080p screen resolution from your Nexus Player, your TV has to support 1080p.
Now, if your TV supports 1080i/1080p and you have cable service, there is a way to go into the cable box and change its output resolution. Otherwise, it would have to be changed via the TV settings.
Unless I'm mistaken, you appear to be misunderstanding the original issue: his TV supports 1080p and other inputs work at 1080p with it, but for some reason his Nexus seems to be outputting only 720p, at least according to what his TV thinks it is receiving.
This appears to happen to other people as well, with no decent resolution so far; hoping that M Preview 3 will contain something positive.
https://productforums.google.com/forum/#!topic/nexus/7A-snuOEYis
I get the same issue. I purchased two Nexus Players today, primarily for running Kodi.
The first is connected via a home theatre receiver to a 1080p TV and displays the Nexus Player content at 1080p.
The second home theatre receiver is connected to a 1080p projector and shows the following resolutions:
- 720p: Nexus Player running Android 5.1.1
- 1080p: Apple TV 3
- 1080p: Sony Blu-ray player
I also tried swapping the inputs on the home theatre receiver to see if the Nexus Player's output resolution changes. It refuses to show content at 1080p, and there is no way to force the output resolution. It certainly makes for a terrible viewing experience with HD content from the Play Store or via Kodi.
The Nexus Player and Android TV really need work on the settings available for manually configuring the output resolution.
Hi guys, I'm also stuck at 720p output from my Nexus Player. Have there been any developments on unlocking 1080p?
Does anybody know if there is a fix for this? I just bought a Nexus Player and it is outputting 720p to my TV. I've searched various websites and couldn't find any solutions. Any input is appreciated. Thanks!!
I'm having the opposite issue: hooked up to a cheap 720p TV, it outputs 1080p, which the TV fails to convert properly, so I lose an inch or so of viewable screen in many apps (the launcher seems OK?).
I found some posts suggesting "Overscan Calibrator" can fix this problem:
http://forum.xda-developers.com/attachment.php?attachmentid=3234854&d=1427641822
It worked for my particular issue.
If someone can't answer this problem, I'm taking it back, because the picture is horrible.
Same issue here... running the N preview.
Tried many HDMI cables.
I can't even SEE a picture, because I cannot reset the resolution from 1080p to 720p. And the silence in the many, many responses on XDA has been deafening on this issue.
Is there ANY way to manually set the resolution?
Does any alternative ROM do this? Does the new developer preview? Can it be done with a 3rd-party app like Overscan Calibrator? Any sort of editor?
Hi there. I have the Nvidia Shield TV Pro and have been having trouble finding the right receiver.
First I purchased a Sony STR-DH550 5.2-channel 4K AV receiver, and found out that it cannot handle 4K upscaling (which the Nvidia Shield TV does, according to Sony). Technically it worked, but the screen would go black and then show the input on the TV, almost like someone unplugging the HDMI cable and reconnecting it. The best name I have come up with for this is an "HDMI blink". It happens numerous times (10 to 15) per day. So according to Sony this receiver was only 4K-passthrough capable, which actually didn't even work either; I had to connect the Shield directly to the TV and use ARC for sound.
Second, I purchased an Onkyo TX-NR636 7.2-channel receiver, which is advertised as 4K passthrough and 4K upscale capable. I still get the HDMI blink, but now it is less frequent (2-3 times per day).
The TV I'm using is a Vizio M50-C1, which is 4K, HDMI 2.0, HDCP 2.2.
It has four 30Hz HDMI ports and one 60Hz HDMI port.
The only way I can get Ultra 4K to play from Netflix is to connect to the 60Hz port.
The other 30Hz ports are labeled 4K, and when connected will play audio/video but will not output in 4K. I do not understand this, so if anyone could shed some light that would be great. (I've tried every troubleshooting step I could think of: unplugging all cables and reconnecting them in sequence; power-cycling the Shield TV after connecting, and the same with the receiver and TV; trying other sources on the receiver; and trying every HDMI port/input.)
So I'm thinking of returning this Onkyo receiver and purchasing a better/more expensive one. I would appreciate anyone's feedback on their experience with the Nvidia Shield TV and 4K receivers/TVs.
Has anyone been able to get full 60Hz 4K, plus 720/1080 upscaling, from receiver to TV without any problems?
Are there any receivers you could recommend that can handle this properly? Thanks for your help.
I can't answer your question directly, as I haven't used 4K receivers, but have you considered that the HDMI cable could be faulty?
Unfortunately I already replaced both HDMI cables with brand-new gold-plated "HDMI 2.0" 4K cables.
I might be able to shed some light.
Firstly, it doesn't sound like your HDMI cable. There's actually no such thing as an HDMI 2.0 cable; it's the devices that are HDMI 2.0, and they will work with any high-speed HDMI cable.
Always plug a 4K device into a 60Hz HDMI port if possible. That way, if any of the apps, or even the home screen, wants to run at 50Hz or 60Hz, you won't experience a problem. Obviously, if it's plugged into a 30Hz port, the picture will disappear when the device outputs anything above 30Hz.
You don't actually want your AVR to upscale the picture; your 4K TV will do it automatically, and TVs generally do a better job than AVRs.
As for the HDMI blink, I'm not entirely sure (I've not actually plugged my Shield into my 4K TV yet); it could be when the system switches resolutions, e.g. 4K to 1080p, as the screen needs a little time to process the new source. My TV does that when my PC is hooked up and I switch between a 4K desktop environment and a 1080p game.
If it's any help, my AVR has no problem with any 4K material I've thrown at it. I use a Pioneer VSX-930.
The Shield TV, like all modern Nvidia devices, has a built-in upscaler. There is no need for an external one; in fact, the extra processing is likely to degrade the gaming experience.
Using ARC is actually a good solution, although it has the disadvantage of requiring the TV to be on just to listen to music.
martyn3000 said:
I might be able to shed some light.
Firstly, it doesn't sound like your HDMI cable. There's actually no such thing as an HDMI 2.0 cable; it's the devices that are HDMI 2.0, and they will work with any high-speed HDMI cable.
Always plug a 4K device into a 60Hz HDMI port if possible. That way, if any of the apps, or even the home screen, wants to run at 50Hz or 60Hz, you won't experience a problem. Obviously, if it's plugged into a 30Hz port, the picture will disappear when the device outputs anything above 30Hz.
You don't actually want your AVR to upscale the picture; your 4K TV will do it automatically, and TVs generally do a better job than AVRs.
As for the HDMI blink, I'm not entirely sure (I've not actually plugged my Shield into my 4K TV yet); it could be when the system switches resolutions, e.g. 4K to 1080p, as the screen needs a little time to process the new source. My TV does that when my PC is hooked up and I switch between a 4K desktop environment and a 1080p game.
If it's any help, my AVR has no problem with any 4K material I've thrown at it. I use a Pioneer VSX-930.
Thank you for the information; it's really helpful. Those were all things I was wondering about but wasn't sure of. Another question for you: I primarily use the Shield remote, not the controller. If I have the Shield connected to the receiver with passthrough and try to adjust the volume with the Shield remote, it does not work; it says the app is set for surround sound and to use the TV remote to adjust the volume. Is this normal? On my previous receiver that didn't happen; I could adjust the volume with the Shield remote. Any thoughts?
NiHaoMike said:
The Shield TV, like all modern Nvidia devices, has a built-in upscaler. There is no need for an external one; in fact, the extra processing is likely to degrade the gaming experience.
Using ARC is actually a good solution, although it has the disadvantage of requiring the TV to be on just to listen to music.
The ARC solution does work; I get 4K video etc., but it still has the HDMI blink when playing non-4K content. The only solution I have found so far is to use the 30Hz input on the TV, which eliminates the HDMI blink, then switch back to the 60Hz port when I want to watch 4K content.
I haven't tried changing the Nvidia HDMI settings to the below-60Hz option; I think it's 27Hz or something like that.
Does the TV work fine with another 4K source like a PC? You might also want to try a shorter and/or different brand of HDMI cable.
I haven't tried another source. I'm not even sure my PC does 4K; would I need a 4K graphics card? I will definitely try a different cable. I had previously used a gold-plated cable that was a few years old, and that's when I noticed the HDMI blink, so I purchased two new ones from Amazon. I'm pretty sure they are the right ones, but would like reassurance if anyone knows.
https://www.amazon.com/gp/product/B00NQ9OQU2/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
Those should work, but if you are looking for a great AV receiver to use with this, I just bought a Yamaha RX-A2050 and everything works great!
http://www.amazon.com/Yamaha-RX-A2050-9-2-Channel-MusicCast-Bluetooth/dp/B00YMN6E7O
mikie00mike said:
Thank you for the information; it's really helpful. Those were all things I was wondering about but wasn't sure of. Another question for you: I primarily use the Shield remote, not the controller. If I have the Shield connected to the receiver with passthrough and try to adjust the volume with the Shield remote, it does not work; it says the app is set for surround sound and to use the TV remote to adjust the volume. Is this normal? On my previous receiver that didn't happen; I could adjust the volume with the Shield remote. Any thoughts?
HDMI control is a fickle beast; I find it works with some devices and not others. It could just be the combination of Shield and AVR you currently have.
I wouldn't change your AVR primarily based on its ability to function with your Shield, though.
mikie00mike said:
I haven't tried another source. I'm not even sure my PC does 4K; would I need a 4K graphics card? I will definitely try a different cable. I had previously used a gold-plated cable that was a few years old, and that's when I noticed the HDMI blink, so I purchased two new ones from Amazon. I'm pretty sure they are the right ones, but would like reassurance if anyone knows.
https://www.amazon.com/gp/product/B00NQ9OQU2/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
If you have a 650 or above GPU (Kepler), it supports 4K output.
I have just overcome some 4K UHD issues with my Shield TV which are similar to the ones you described.
The main problem is HDCP 2.2...
This is what I found out.
Some time last year, the HDCP standards were changed so that any 4K 60Hz display will only accept a 4K 60Hz UHD signal if it is wrapped in HDCP 2.2. Put simply, this means that if any device in the connectivity chain is not HDCP 2.2 compatible, you will not get a 4K 60Hz UHD picture on your display.
For me this was a problem because my 4K UHD HDMI switch was not HDCP 2.2, and my LG 55EG960v refused the non-HDCP 2.2 signal, so the Shield TV would auto-reconnect at 1080p because the connectivity chain was incompatible...
4K UHD Netflix requires HDCP 2.2, which is why it will only connect/work on the single HDMI input on your TV rated for 4K 60Hz HDCP 2.2...
The only way to guarantee a working 4K 60Hz UHD signal is to ensure all your equipment is HDCP 2.2 certified, or to do some clever duplex routing.
Until this year, HDCP 2.2 equipment at reasonable prices was very scarce, but Ligawo is a German manufacturer that seems to have just released a whole range of HDCP 2.2 routers/switches/splitters etc. at reasonable consumer-level prices... I am sure there will be many more suppliers soon.
The rule of thumb I would use is: unless the specification clearly states the device/equipment is HDCP 2.2 compatible, don't go near it...
I really, really hate DRM...
PS: It sounds like you may not need to replace your expensive AV receiver, and what you want to achieve could be done through duplex routing. Happy to have a PM discussion if that would help. I currently run 5 consoles, 1 STB, 1 Shield TV, 1 PC, and a Chromecast Audio, all with a Yamaha DSP soundbar/receiver (which is only HDMI 1.4 compatible), into 1 TV, and can still get 4K UHD 60Hz + 7.1 HD audio... So it can be done without replacing your AV receiver.
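The weakest-link rule described above is simple enough to state as code. This toy sketch (the device names and the "hdcp" field are made up purely for illustration) captures the logic: one non-HDCP 2.2 hop anywhere in the chain and the source falls back from 4K60.

```python
# Toy model of the HDCP chain described above: protected 4K60 content only
# plays if every hop between source and display negotiates HDCP 2.2.
# Device names and the "hdcp" field are invented for illustration.
chain = [
    {"name": "Shield TV", "hdcp": "2.2"},
    {"name": "HDMI switch", "hdcp": "1.4"},   # the weak link in my old setup
    {"name": "LG 55EG960v", "hdcp": "2.2"},
]

def chain_supports_4k60(devices):
    """True only when every device in the connectivity chain is HDCP 2.2."""
    return all(dev["hdcp"] == "2.2" for dev in devices)

print(chain_supports_4k60(chain))  # False: the source falls back to 1080p
```

Replacing the switch with an HDCP 2.2 model (or duplex-routing around it) is exactly what flips this check to True.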
THIS IS AWESOME!! Thanks so much for this. Would you mind if I PM you later tonight when I get home from work? Also, would I need a new router? Right now I have an Asus RT-AC66U dual band. I also have a WD My Cloud EX4 that I stream videos from; would that need to be replaced too? I will PM you later if that's OK. Thanks again.
MintyTrebor said:
I have just overcome some 4K UHD issues with my Shield TV which are similar to the ones you described.
The main problem is HDCP 2.2...
This is what I found out.
Some time last year, the HDCP standards were changed so that any 4K 60Hz display will only accept a 4K 60Hz UHD signal if it is wrapped in HDCP 2.2. Put simply, this means that if any device in the connectivity chain is not HDCP 2.2 compatible, you will not get a 4K 60Hz UHD picture on your display.
For me this was a problem because my 4K UHD HDMI switch was not HDCP 2.2, and my LG 55EG960v refused the non-HDCP 2.2 signal, so the Shield TV would auto-reconnect at 1080p because the connectivity chain was incompatible...
4K UHD Netflix requires HDCP 2.2, which is why it will only connect/work on the single HDMI input on your TV rated for 4K 60Hz HDCP 2.2...
The only way to guarantee a working 4K 60Hz UHD signal is to ensure all your equipment is HDCP 2.2 certified, or to do some clever duplex routing.
Until this year, HDCP 2.2 equipment at reasonable prices was very scarce, but Ligawo is a German manufacturer that seems to have just released a whole range of HDCP 2.2 routers/switches/splitters etc. at reasonable consumer-level prices... I am sure there will be many more suppliers soon.
The rule of thumb I would use is: unless the specification clearly states the device/equipment is HDCP 2.2 compatible, don't go near it...
I really, really hate DRM...
PS: It sounds like you may not need to replace your expensive AV receiver, and what you want to achieve could be done through duplex routing. Happy to have a PM discussion if that would help. I currently run 5 consoles, 1 STB, 1 Shield TV, 1 PC, and a Chromecast Audio, all with a Yamaha DSP soundbar/receiver (which is only HDMI 1.4 compatible), into 1 TV, and can still get 4K UHD 60Hz + 7.1 HD audio... So it can be done without replacing your AV receiver.
So does this mean the 650-or-above GPU is for the receiver or the media player? And when you say 650, does that handle 4K passthrough as well as full 4K upscaling from 480/720/1080p content?
NiHaoMike said:
If you have a 650 or above GPU (Kepler), it supports 4K output.
In the PC you're using as a source, to rule out the Shield as the problem. And yes, any Kepler or newer GPU (including the one built into the Shield) will upscale all the way to 4K, although the smaller ones won't be able to handle the most advanced algorithms. But unless you're a hardcore videophile, you'll be hard pressed to tell the difference between how well a high-end GPU upscales and how well a smaller one does. Not surprising, given that image scaling is one of the most fundamental parts of 3D rendering.
mikie00mike said:
THIS IS AWESOME!! Thanks so much for this. Would you mind if I PM you later tonight when I get home from work? Also, would I need a new router? Right now I have an Asus RT-AC66U dual band. I also have a WD My Cloud EX4 that I stream videos from; would that need to be replaced too? I will PM you later if that's OK. Thanks again.
I PM'd you some stuff.
Minty or Mike, may I get a copy also, please?
Minty, I PM'd you.
Thanks!
mikie00mike said:
THIS IS AWESOME!! Thanks so much for this. Would you mind if I PM you later tonight when I get home from work? Also, would I need a new router? Right now I have an Asus RT-AC66U dual band. I also have a WD My Cloud EX4 that I stream videos from; would that need to be replaced too? I will PM you later if that's OK. Thanks again.
thruster999 said:
Minty or Mike, may I get a copy also, please?
Minty, I PM'd you.
Thanks!
I have updated my config since I last sent the details, but I still duplex (send multiple signals of the same HDMI source) to bypass the HDCP restrictions, so I can continue to use my older soundbar. Picture with notes attached. Shout if you need anything more.
I just switched to using a new TV over the weekend that's (finally!) 4k.
If it matters, the TV is the brand-new Vizio P65-C1; a good write-up on the TV can be found here.
When I plug the Shield TV directly into the TV via HDMI and go to the Shield's settings, then HDMI, it detects and shows "(Recommended)" next to 4K 60Hz, which is what I set it to.
I can play 4K YouTube videos, I can watch anything I want without issue on Plex or Kodi, and I can play all of my usual Shield-exclusive games like Doom 3 or Portal without issue, but for whatever reason Netflix will just not play anything. Not just 4K/UHD content: any content. I can move through the Netflix menus and browse for what I want to watch without issue, but when I try to actually play something, it has no audio and freezes; it might jump forward in broken increments or stay 100% frozen on the screen. I can stop it without issue, and the app is otherwise responsive and not frozen.
I can cast Netflix content to my Shield TV and it works without issue also. Correction: I cannot cast Netflix to the Shield TV; I get the same issues as when trying to play locally. I was accidentally casting to my TV directly, which does work without issue. Both my TV and my Shield TV are hard-wired to my router via Ethernet.
It's only when I try using the built-in Netflix app.
The thing I noticed when casting vs. the native app is that casting shows a loading progress screen and can take a few seconds, depending on the content's quality, to hit 100% before it starts playing. It seems like the native Netflix app is not caching the content and is trying to direct-stream it, which it cannot keep up with for some reason.
The same happens with cartoons, older shows that are at best 1080p like The X-Files, and of course UHD/4K content; there is no difference in behavior between the different source videos in the native app.
I have already created a support request directly with Nvidia, but wanted to see if others have encountered this as well and have any suggestions on what I can try to fix it.
Hey man... finally got some sun in Atlanta? What a nice day...
Anyway, see the link below; I answered this for somebody else and it applies to you. The key is HDCP 2.2-compatible equipment all along your video chain: Shield (which is), receiver, and TV input. If any link isn't, it won't work.
http://forum.xda-developers.com/shield-tv/help/netflix-4k-issue-t3291720
MrBungle67 said:
Hey man... finally got some sun in Atlanta? What a nice day...
Anyway, see the link below; I answered this for somebody else and it applies to you. The key is HDCP 2.2-compatible equipment all along your video chain: Shield (which is), receiver, and TV input. If any link isn't, it won't work.
http://forum.xda-developers.com/shield-tv/help/netflix-4k-issue-t3291720
It was a beautiful, pollen-coated day today.
I checked, and sure enough the TV does support HDCP 2.2. Though it also notes that while it's HDMI 2.0, it's not enabled yet and will be coming soon in a firmware upgrade. Makes me wonder if that will fix this once it's available.
I'm still going back and forth with Nvidia as well, but so far nothing other than answering some questions and sending them some logs.
From your TV user manual:
When connecting an HDMI 2.0 device, HDMI Color Subsampling needs to be enabled to support 4K resolution at 60Hz.
reTARDIS said:
Though it also notes that while it's HDMI 2.0, it's not enabled yet and will be coming soon in a firmware upgrade.
That statement is not correct... It supports HDMI 2.0 out of the box; it's the HDMI 2.0a support that will be coming via a firmware upgrade.
the.teejster said:
From your TV user manual:
When connecting an HDMI 2.0 device, HDMI Color Subsampling needs to be enabled to support 4K resolution at 60Hz.
That statement is not correct... It supports HDMI 2.0 out of the box; it's the HDMI 2.0a support that will be coming via a firmware upgrade.
Ah, good catch. I'm not as up to date on the HDMI specs/standards these days.
I'm guessing HDCP 2.2 should work out of the box, though; I will contact Vizio support to confirm.
Just curious... have you tried different HDMI inputs?
"The new P-Series models will include: four HDMI 2.0a/HDCP 2.2 inputs, plus one additional HDMI that handles 4K/60p at 4:2:0, although it is listed as HDMI 1.4, Vizio told us. The HDMI 2.0a inputs will allow the Vizio displays to connect to new Ultra HD Blu-ray players to stream 4K with HDR10 metadata, something earlier Vizio 4K UHD TVs have lacked."
Are you connected to that one input?
MrBungle67 said:
Just curious... have you tried different HDMI inputs?
"The new P-Series models will include: four HDMI 2.0a/HDCP 2.2 inputs, plus one additional HDMI that handles 4K/60p at 4:2:0, although it is listed as HDMI 1.4, Vizio told us. The HDMI 2.0a inputs will allow the Vizio displays to connect to new Ultra HD Blu-ray players to stream 4K with HDR10 metadata, something earlier Vizio 4K UHD TVs have lacked."
Are you connected to that one input?
According to Vizio support, "The TV supports HDCP 2.2 on all HDMI ports", so I'm not sure. The manual does not detail HDCP support for specific ports at all.
@reTARDIS I just bought the same TV and am seeing the same issue. I noticed that if I change the audio setting for the show I'm watching from "English 5.1" to "English", the show streams correctly. I'm not sure why this is happening either.
oracleicom said:
@reTARDIS I just bought the same TV and am seeing the same issue. I noticed that if I change the audio setting for the show I'm watching from "English 5.1" to "English", the show streams correctly. I'm not sure why this is happening either.
Interesting find. I'll test that on mine this evening.
@oracleicom I just tried this and confirmed it works for me as well. I sent an update to Nvidia with this info to see what they have to say. Thanks for the workaround; it's not perfect, but it will do for the short term.
I would make sure you have an HDMI 2.0-rated cable (18 Gb/s); that's the only kind that supports 4K @ 60fps with 4:4:4 chroma (color).
Use a port on the TV that supports HDMI 2.0.
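For what it's worth, the 18 Gb/s figure mentioned above falls out of the HDMI 2.0 timing math: 4K60 at 8-bit 4:4:4 runs a 594 MHz pixel clock (4400 x 2250 total pixels including blanking intervals) over three TMDS channels with 8b/10b encoding. A quick sanity check of the arithmetic:

```python
# Sanity-check the "18 Gb/s" HDMI 2.0 figure for 4K60 at 8-bit 4:4:4.
total_h, total_v = 4400, 2250   # active 3840x2160 plus horizontal/vertical blanking
refresh_hz = 60
pixel_clock = total_h * total_v * refresh_hz     # 594,000,000 (the 594 MHz clock)
channels, bits_per_symbol = 3, 10                # three TMDS lanes, 8b/10b coding
bandwidth_bps = pixel_clock * channels * bits_per_symbol
print(f"{bandwidth_bps / 1e9:.2f} Gb/s")  # prints 17.82 Gb/s
```

That 17.82 Gb/s is what gets rounded up to the "18 Gb/s" marketing number, and why older 10.2 Gb/s links top out at 4K30.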
I am having the same issue with the Nvidia Shield and the same TV. I was so excited when I got it today. I can also make it work by changing the audio to "English" from 5.1. I had the Shield hooked up to a previous Vizio 4K TV and there was no issue. I am talking to Vizio support now, and they want to blame the Shield even though it works fine with their previous 4K TV.
I also tested with my Nexus Player, and it has the same issue.
@reTARDIS Did you ever get a reply from anyone at Vizio or Nvidia? On mine, everything but Netflix works fine, so maybe it is something Netflix can fix in their Android TV app.
Correction: both Google Play Movies & TV and Netflix have the issue. Nvidia is saying it is Vizio's issue, and Vizio is saying it is Nvidia's. I pointed out that the Nexus Player has the same issue with the new Vizio, and they have not replied.
reverik said:
@reTARDIS Did you ever get a reply from anyone at Vizio or Nvidia? On mine, everything but Netflix works fine, so maybe it is something Netflix can fix in their Android TV app.
They've really given no indication that they clearly understand what is happening, and every time I ask for an update, if they reply at all, it's just to say that they're still researching the issue.
They recently suggested that I test the following:
Shield TV HDMI out -> Vizio P65-C1, then Vizio P65-C1 HDMI ARC -> Pioneer SC-1522K
Initially I didn't think this would work, as I had thought TVs did not output anything via HDMI and could only take input. But after doing some further reading today on HDMI and ARC, I think it might work to output just audio from my TV to the receiver over one of the ARC-capable HDMI ports, though I have not yet tried this.
As a backup solution, I ordered a USB-to-optical audio adapter last night that others have said works with the Shield TV to output audio via USB to TOSLINK. The device includes a TOSLINK-to-SPDIF adapter. I plan to use this to output SPDIF directly to my receiver, bypassing the Vizio TV, to see if it resolves the issue. I will update once I've tested it.
I'll report back my findings from both tests.
Update: the same issue exists with the USB-to-SPDIF/optical adapter plugged straight into the SPDIF/optical input on my receiver (bypassing the TV for audio).
I'm at a loss for what to try next, and the issue has not gone away.
Update: I figured out how to make ARC over HDMI work from my TV to my receiver over the weekend. All devices now connect to the TV via HDMI, and audio goes out via the TV's ARC HDMI port (HDMI 1) into the ARC input on my Pioneer receiver. While audio works without issue for every connected device and for the Shield TV in general, the same issue exists as before with 5.1 audio playback from Netflix on the Shield TV.
I also did a factory reset on my Shield TV this past weekend, but sadly it didn't change/fix the issue, which still persists.
I just cast to my TV now for Netflix and avoid using the Netflix app on the Shield TV.