New Vidchat Platform – What You Need to Know

Here are some things to be aware of about the upcoming new vidchat platform for member vidchats:

First, it will require some testing to implement. It may or may not be ready for the next show, and it's unlikely we'll release it over a holiday week. It's in the works, but it won't be immediate. We estimate it will be live within 60 days, but keep in mind that estimate is subject to testing. Nothing goes out without tests, so we're not promising a date at this point.

Second, it will require some learning. It won't be the old system, and it won't work exactly like the old system did. We're not spending $500K on a custom platform (we'd have to make memberships far more expensive for that), so we'll be using an existing platform that meets several of our key needs: capturing recordings, permitting unlimited viewers, and providing a means for Q&A.

Third, changing platforms may or may not resolve latency issues in some broadcasts. Here's exactly why:

There are 3 parts to any such system:

  1. the internet bandwidth for the broadcast (Dr. Farrell’s internet connection)
  2. the broadcasting machine (Dr. Farrell’s PC)
  3. the receiving network (the platform where the video broadcast is being hosted)

Part of the issue may not be the current platform at all. That platform has 24 million users and about 14 million live streams per year, and a ton of successful broadcasts are happening on it as you read this. It may not be particularly tolerant of hiccups in bandwidth or of constrained system resources, but dumping everything at the feet of the current platform would be like blaming Ford because your truck doesn't run well on 4 of its 6 cylinders.

In the overwhelming majority of cases, when there is latency in a video broadcast, the cause is in the broadcaster's internet connection: latency, packet loss, or limited upload speed, or some combination of the three. Those three factors are absolutely huge when uploading live video; raw download speed isn't the big issue. Sending video over the internet requires maximum stability in the connection, because any hiccup on the sender's end becomes a hiccup for all receivers. If *all* receivers are experiencing and reporting the same issues, the sender's connection is highly likely to be at least part of the cause.
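As an illustration of checking those three factors yourself, here's a small Python sketch (ours, not anything provided by the platform) that parses the summary lines `ping` prints on Linux/macOS to pull out the packet-loss percentage and average round-trip latency. The sample output below is invented for the example:

```python
import re

def parse_ping_summary(output):
    """Pull packet-loss % and average round-trip time (ms) from the
    summary lines printed by `ping` (Linux/macOS style output)."""
    loss = re.search(r"([\d.]+)% packet loss", output)
    # The rtt line reads: min/avg/max[/mdev] = a/b/c[/d] ms
    rtt = re.search(r"= [\d.]+/([\d.]+)/", output)
    return (float(loss.group(1)) if loss else None,
            float(rtt.group(1)) if rtt else None)

# Example summary, as printed after something like `ping -c 20 example.com`:
sample = """20 packets transmitted, 19 received, 5.0% packet loss, time 19028ms
rtt min/avg/max/mdev = 24.1/61.8/310.2/70.4 ms"""

loss_pct, avg_ms = parse_ping_summary(sample)
print(loss_pct, avg_ms)  # 5.0 61.8
```

Any sustained packet loss above a percent or two, or average latency spiking into the hundreds of milliseconds, is the kind of instability that turns into hiccups for every viewer.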

Another possible cause is a PC/system issue (computer resource management) on the broadcaster's end: current free RAM, % of CPU use, disk speed (for video processing), available disk space during the broadcast, graphics processor RAM and speed (which is separate from system RAM), and whether the graphics processor is integrated (cheap, and uses the CPU as its brain) or standalone (has its own brain). Those factors, alone or in combination, determine how likely it is that the video broadcast will be smooth; any one of them could contribute to a poor one.
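To make one of those factors concrete, here's a rough back-of-the-envelope sketch of how much disk space a local recording eats during a show. The bitrates below are illustrative assumptions, not the actual settings of any platform we use:

```python
def recording_size_gb(video_kbps, audio_kbps, minutes):
    """Approximate disk footprint of a recording at a given bitrate."""
    total_bits = (video_kbps + audio_kbps) * 1000 * minutes * 60
    return total_bits / 8 / 1e9  # bits -> bytes -> decimal GB

# A 2-hour show at an assumed 2500 kbps video + 128 kbps audio:
size = recording_size_gb(2500, 128, 120)
print(round(size, 1))  # 2.4 (GB)
```

A couple of gigabytes sounds small, but on a nearly full drive the writes compete with the disk activity the live broadcast itself needs.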

Of course, Giza Support doesn’t have a local technician on site with the host for the broadcasts (our scope is supporting technology on the web site, not technology resident in his office), so we can’t verify that those were contributing factors on Black Friday. We have seen internet latency issues from the same location in the past, however.

The new platform will, like all platforms, be imperfect, and we cannot promise or necessarily expect that it will resolve every issue. Worst case, it may do nothing more than remove ads and reveal that the issues were never platform-specific at all. Or the cause may be *both* the platform and the broadcaster's system or connection. If we see similar issues on the new platform, it falls to the show's host to resolve connectivity or system-resource issues with local professionals, whom Giza Support can refer but who are outside our scope.

Hoped-for middle ground: the new platform may be more tolerant of internet latency, packet loss, or system-resource issues. If it is actually designed to accommodate a long 'everyman' broadcast, it will likely do so by imposing a) a slight drop in video resolution and sound definition and b) an air delay to allow for processing, just as on a live news broadcast. That middle ground would be good. It doesn't mean we wouldn't get *better* broadcasts if the uploading source had fewer latency issues, but the added tolerance would be welcome. Signs are good that we might hit this middle ground.
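The air-delay idea can be sketched in a few lines: a frame misses its playout slot only if the network stalls it for longer than the fixed delay buffer. The delay numbers below are invented for illustration:

```python
def underruns(arrival_delays, buffer_s):
    """Count frames that miss their playout deadline: a frame delayed
    longer than the fixed air-delay buffer arrives too late to show."""
    return sum(1 for d in arrival_delays if d > buffer_s)

# Hypothetical per-frame network delays (seconds), with one 12 s stall:
delays = [0.2, 0.4, 0.3, 12.0, 0.5, 0.2]
print(underruns(delays, 2))   # 1 -> a 2 s buffer still glitches
print(underruns(delays, 20))  # 0 -> a ~20 s air delay absorbs the stall
```

That's the trade a tolerant platform makes: viewers see everything a little later, but a stall on the sender's end can be smoothed over instead of freezing the stream.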

So the bottom line: we ask you to bear with us while we test the new system, bear with learning how to use it versus the old one when we roll it out, and keep expectations modest (it's not CNN) while crossing your fingers that it turns out to be more practical.

In the meantime, some tips:

The recorded version from Black Friday will have the same issues as the live broadcast, because it is a recording of that broadcast. That actually demonstrates that the issues lie in the connection between the sending computer and the platform provider's servers, though it doesn't say on which end. We consider the Black Friday session a throwaway: if you found it useful, great; if not, we're taking a pass on it.

If you're a technician capable of providing local / remote support in the following 2 areas:

  1. diagnosing local internet latency, line quality, packet loss, and upload speed issues, making recommendations to resolve, and followup testing to monitor and verify ongoing results
  2. diagnosing system resource clogs specifically during live video broadcast streaming from a PC (current free RAM, % of CPU use, disk speed [for video processing], available disk space [for video pre-processing], graphics processor RAM and speed, or GPU/CPU interference)

Feel free to reach out to Dr. Farrell directly if you'd like to volunteer as a local/remote support person. Facility in correctly assessing all of the above areas of concern would be essential, as would availability before and perhaps during broadcasts. The fact is, we can switch platforms, but without someone remotely supporting the broadcasting machine, we've only changed out a part in that Ford and hoped it did the trick.
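For anyone considering volunteering, the item-2 checks can be summarized as a quick self-test sketch. The thresholds here are rough rules of thumb for live video broadcasting, not hard requirements:

```python
def broadcast_readiness(ram_gb, disk_free_pct, drive_rpm,
                        standalone_gpu, gpu_ram_gb):
    """Compare a broadcasting PC against rule-of-thumb minimums
    (illustrative thresholds: 16 GB RAM, 50% free disk space,
    7200 rpm drive, standalone GPU with 2 GB of its own RAM)."""
    return {
        "ram": ram_gb >= 16,
        "disk_free": disk_free_pct >= 50,
        "drive_speed": drive_rpm >= 7200,
        "gpu": standalone_gpu and gpu_ram_gb >= 2,
    }

# An underpowered example system fails on every check:
print(broadcast_readiness(8, 30, 5400, False, 0))
```

A real diagnosis would of course measure these values on the live machine; the point of the sketch is only to show which numbers matter and roughly where the bar sits.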

Giza Developer

Business and Technical Developer of the Giza Community.

2 Comments

  1. gizadeathstar on November 29, 2014 at 12:15 pm

    The thing is, when Dr. F hits the record button, it’s actually an increased interaction between his pc and the receiving server. It’s not just that the receiving server is doing all the heavy lifting of recording. It’s that his own system is doing some sort of pre-processing for it. I suspect what’s happening is he’s broadcasting one stream to be viewed and a 2nd stream to be recorded.

    Let’s say he becomes the recorder. That would deeply tax the system resources he is already using to broadcast to viewers. Again, there are 2 streams, only now he has even fewer resources for the broadcast: his system doubles as the video processor in full, capturing with the same GPU that’s broadcasting, chewing up his RAM the longer the show gets, and consuming the disk space and disk activity that the actual broadcast needs. The problem is repeated, just with a slight variation of flavor.

    In other words: if we add screen recording to the mix (I know, I’ve done what you suggest), it will further drain his system resources to the point that he is now also the recording server, and it may not resolve the issue.

    Plus, unlike me, he’s not set up with multiple monitors, so he can’t actually manage the broadcast, chat, and recording all at once.

    It’s a good idea, and it’ll be a possible fallback position if nothing else works, but the new system is what we hope will mitigate the problem. It will record *directly* to YouTube servers, and it places the live broadcast on a delay of some 20 seconds or so. That means it may be much more tolerant of bandwidth and system resource issues, and my experience with it has been that it’s fairly tolerant. Effectively, the stream goes out to one receiver, and that receiver *then* hands it off to a recording server and a broadcast server. I think maybe what Ustream is doing is making Dr. F broadcast to both at once and do some preprocessing on his end. Can’t prove it, but we’ll soon find out. 🙂

    Ultimately though, no matter what, the recordings will never be stellar until/unless he has the cleanest possible connection (no latency, no packet loss, no line quality issues), really good upload speed (download speed be d*mned), and a computer system geared for video broadcast – ideally lots of free RAM, lots of free disk space, a really fast internal main drive, a *separate* video card instead of an integrated one (so it has its own brain/GPU and doesn’t borrow the computer’s CPU to do most of the processing), and good RAM on the video card, not just for the computer itself. Personally I use 2 standalone 2 GB video cards for my broadcasts, and they serve me well. Running 4 monitors. Even one such card would be great if he doesn’t have at least that much. And at least 16 GB overall system RAM, and at least a 7200 rpm main hard drive with at least 50% free space at any given time. Then a super clean internet connection with fast upload speed. Those things and the new platform could make it super stellar. 🙂

    But starting out, we’re getting to the new platform as soon as possible. A more tolerant system would help accommodate less-than-ideal hardware and connections.



  2. kfitzgerald60 on November 29, 2014 at 11:48 am

    Daniel – there is rarely an issue streaming the pre-vidchat portion of the show. Only when Joseph starts to record do the problems appear. Would it be possible to record the Ustream broadcast with a separate program, i.e. screen capture software on Joseph’s computer or a remote computer, and then upload the video to YouTube?


