The dark blue line is the time taken to encode the frame (too high = weak CPU on host)
The light blue line is then the time taken for that frame to be transmitted over the network (too high = weak network)
The red line is the time taken to decode the frame and draw it (too high = weak CPU/GPU on client depending on which is being used as decoder)
Since it's a stacked graph, you can pretty much read off the red line as the sum of the *additional* latency added by streaming. The F6 display also shows the *total* latency, which is the sum of the input latency (time taken by the host to receive data from the controller) + the time taken for the host to generate a frame (game latency) + the display latency (latency resulting from streaming, as shown by your graph's red line).
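To make that arithmetic concrete, here's a minimal sketch of how the stacked graph and the F6 total break down. All the numbers are made up for illustration, not measured:

```python
# Hypothetical per-frame timings in milliseconds (illustrative only).
encode_ms = 4.0    # dark blue: host encodes the frame
network_ms = 3.0   # light blue: frame transits the network
decode_ms = 5.0    # red: client decodes and draws the frame

# The red line on the stacked graph is the sum of the streaming overhead.
streaming_ms = encode_ms + network_ms + decode_ms

# The F6 overlay's *total* latency adds the non-streaming components.
input_ms = 2.0     # host receiving controller input
game_ms = 16.7     # host rendering a frame (~60 fps game)
total_ms = input_ms + game_ms + streaming_ms

print(f"streaming overhead: {streaming_ms:.1f} ms")  # 12.0 ms
print(f"total latency:      {total_ms:.1f} ms")      # 30.7 ms
```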
The other specs just show what resolution the capture is, how much network bandwidth Steam thinks is available and how much it is using and the overall frame rate. A bad network link will give you dropped packets, and a slow decode on the client side will give you dropped frames which are also shown.
IMHO the best sequence for optimisation is to reduce the frame rate first, then the capture resolution - going from 60fps to 30 halves the amount of data being captured, sent over the network and drawn. This will have a massive impact on your latency. Going from 1080p to 720p will also drastically reduce your latency. Network bandwidth only really affects the image quality. For 720p @ 30fps it seems to want to use about 15 Mb/s, for me at least. I don't know if higher compression rates (lower bandwidth allocated) affect capture latency; I haven't measured it.
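As a back-of-the-envelope check on those claims, here's the raw (pre-compression) data rate at each setting. These are uncompressed figures assuming a YUV 4:2:0 capture at 12 bits per pixel; the actual stream is compressed well below this, so the point is the ratios, not the absolute numbers:

```python
# Raw pixel data rate before compression, in megabits per second (illustrative).
def raw_rate_mbit(width, height, fps, bits_per_pixel=12):  # 12 bpp ~ YUV 4:2:0
    return width * height * bits_per_pixel * fps / 1e6

r_1080_60 = raw_rate_mbit(1920, 1080, 60)  # 1492.992 Mb/s raw
r_1080_30 = raw_rate_mbit(1920, 1080, 30)  # halving fps halves the data
r_720_30 = raw_rate_mbit(1280, 720, 30)    # 1080p -> 720p cuts pixels ~2.25x

print(r_1080_60, r_1080_30, r_720_30)
```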
It's also worth bearing in mind that the streaming process is going to reduce your game frame rates overall on a weak rig. If you're CPU bound in some games you're going to have a bad time (as I've found out mucking about with Dolphin and PCSX2 on my ancient Core2quad).
HTH
Dave
Cheers
MB = MegaByte
Mb = MegaBit
mb = MilliBit, which makes no sense because no one ever uses that
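The distinction matters when comparing the overlay's bandwidth figure against router or ISP specs; a quick conversion sketch (the ~15 Mb/s figure is the one quoted above for 720p @ 30fps):

```python
# Convert between megabits (Mb) and megabytes (MB): 1 byte = 8 bits.
def mbit_to_mbyte(mbit):
    return mbit / 8

def mbyte_to_mbit(mbyte):
    return mbyte * 8

# A ~15 Mb/s stream is only ~1.9 MB/s in file-transfer terms.
print(mbit_to_mbyte(15))   # 1.875
```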
makes it easier to spot potential bottlenecks
How many ms makes a difference that we can notice?
is 20ms too slow or too fast?
is 30ms considered slow?
How many ms does it take to be lag?
What's noticeable and acceptable would probably depend on the person and the game. FPSes, for example, need much lower latency than platformer/adventure type games (which in turn need much lower latency than a turn-based strategy game). Likewise, everyone is different and will likely have varying levels of sensitivity to it.
Another thing I've noticed is that when you're using a controller, you don't tend to feel the lag nearly as bad as when you're using a keyboard and mouse.
However, with the current state of IHS, I honestly can't feel lag at all anymore in most games. It's come a long way in a short amount of time. Using both Wireless N and/or a wired gigabit connection, I've been getting great performance with almost no perceivable lag and a solid 60fps... which used to be pretty severe over wireless when it first launched.
What ms do you feel is too much? Please pick a number between 1ms and 100ms.
Just to be sure: if my red line is high, I need a better client? If I get the Steam Link, will that solve it?