How loud is your stream?

Below is a sample of data I took by recording popular news channels for exactly 3 minutes each and then analysing the results. This was done between 0530 GMT and 0614 GMT on 01/04/12. Here is the test subject list of freely available news streams:


http://edition.cnn.com/video/

http://www.ndtv.com/video/live/channel/ndtv24x7

http://live.foxnews.com/

http://www.aljazeera.com/watch_now/

http://www.bbc.co.uk/news/10318089

http://www.bloomberg.com/tv/europe/

http://www.euronews.net/news/streaming-live/


Here was my process


Everything was done on my iMac in OS X (10.7.2), and all testing stayed in the digital domain.
All audio file settings/encodes were 48000Hz/16-bit/stereo. All local playback used QuickTime Player 10.1 and all web playback used Safari 5.1.1.


1) Created a -18dBFS tone file in Adobe Audition.
2) Used http://www.audiofile-engineering.com/spectre/ to confirm that, with all audio controls set to max, the playback level was -18dBFS on a digital meter and PPM 4 on a BBC meter (all checked out).
3) Created a video file in iMovie of a -18dBFS tone and uploaded it to YouTube: http://www.youtube.com/watch?v=tziQZGjIspM
4) With the YouTube Flash player volume set to max, played it back and calibrated my Flash player to the Spectre meters (only needed about a -1.5dB adjustment, not sure why yet).
5) Went through the popular news channels with their Flash player volumes set to max and recorded a 3-minute PCM WAV file from each; also did the same with the calibrated tone from the YouTube player.
6) Put the .wav files through a Tektronix Cerify file checker, obtained the results and plotted them on the graph below.
So I think it's a fair test, as I ensured a unity-gain path at all stages and recorded all files with the same setup.
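For reference, -18dBFS maps to a linear amplitude like this (a quick PowerShell sketch, assuming a peak-referenced level):

# -18dBFS corresponds to 10^(-18/20) of digital full scale
$peak = [math]::Pow(10, -18/20)               # ~0.126 of full scale
$sample16bit = [math]::Round($peak * 32767)   # ~4125 on a 16-bit scale
"{0:N3} of full scale, peak 16-bit sample ~{1}" -f $peak, $sample16bit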


As you can see, CNN, Bloomberg, Al Jazeera and BBC all group well, as you would suspect. For the two other channels, NDTV and Euronews, I have no conclusion yet. I was expecting to see more of a spread of results on the internet streams compared to TV baseband audio levels, but I guess (and hope) the streams are following baseband TV levels, and with modern encoders there is no real difference between baseband TV levels and the internet stream levels.


BTW
a) I'm assuming a -18dBFS reference level; it could really be -20, -24 or anything.
b) I'm assuming the Flash players all pass audio at full scale (0dBFS, unity) at their max setting.
c) This was only a specific sample window; I would need more samples to be a lot more accurate.


Find that asset - PowerShell Script - Windows

Managing media assets is great nowadays with all these web-UI CMS-type applications. However, when troubleshooting a backend system I often find myself needing to locate specific files in a number of locations quickly. A colleague of mine came up with a nice little .bat script to search for a filename in a bunch of different server locations, but it was all hard-coded and didn't format nicely on screen.

I decided to pimp it up with a nice PowerShell script...

Essentially, at the command prompt use the format:

./find_files_in_broadcast_servers.ps1 -f "yourfilename_or_wildcard"

e.g.

./find_files_in_broadcast_servers.ps1 -f "catplayingpiano.mpg"
./find_files_in_broadcast_servers.ps1 -f "*cat*"

 

 

# Script to search for files in the locations listed in locations.txt #
# Usage - ./find_files_in_broadcast_servers.ps1 -f "yourfilename_or_wildcard" #

# Grab the filename (or wildcard) argument from the command line #
param([string]$f = "filename")

# Read locations.txt into $a, one search location per line (CRLF at the end of each line) #
$a = Get-Content "C:\scripts\powershell\locations.txt"

# Use a foreach loop to step through the locations in locations.txt #
foreach ($i in $a)
{
    # Print the search location to the console window in green #
    Write-Host -ForegroundColor Green $i

    # Search for the file, recursing subdirectories under the root location from locations.txt, sort by name and format as a table #
    dir $i -Recurse -Filter $f | sort name | ft Directory, Name, LastWriteTime
}

# END #

You will need to create a locations.txt file in the location specified on the Get-Content line, with all of your locations listed one per line. Use the Windows UNC format. For example, locations.txt:

\\server1\location1
\\server2\location2
\\server3\location3
\\server4\location4

 

Tip: you may need to set PowerShell's execution policy to allow unsigned scripts to run; there are various options depending on your security needs.
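For example, a common choice is to allow locally created scripts while still requiring downloaded ones to be signed (run from an elevated prompt, and adjust to your own security policy):

# Allow locally written scripts to run; downloaded scripts must still be signed
Set-ExecutionPolicy RemoteSigned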

See

PowerShell Security

 

Neil (themoog)

If in doubt just add layers

In a previous job I worked as a CAD operator. A few years later, when I was working in broadcast engineering, I looked at creating some more standard AutoCAD templates and blocks for broadcast drawings. I've never really got around to it; each time I do a drawing I just modify an old one from an existing template a colleague of mine made.

So I've decided to get back on it and create myself a better, more automated workflow.

One of the first things I looked at was how to standardise layers. A few Google searches later I found that the AIA (American Institute of Architects) had a system that seemed perfect (AIA CAD Layer Guidelines). With a few modifications I came up with the following…

themoog's layering standards for broadcast AutoCAD drawings

As recommended by the 1997 AIA CAD Layer Guidelines, layer names may be as short as six characters (discipline code + major group) or as long as sixteen characters (discipline code + major group + minor group + status). See below for examples.

B-VID = discipline code + major group
B-VID-SDI = discipline code + major group + minor group
B-VID-EXST = discipline code + major group + status code
B-VID-SDI-EXST = discipline code + major group + minor group + status code
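If you want to keep the names consistent, a hypothetical little PowerShell helper along these lines could assemble them (just a sketch, not part of any standard):

# Hypothetical helper that builds a layer name from the fields above
function New-LayerName {
    param(
        [Parameter(Mandatory)][string]$Discipline,  # e.g. "B" for broadcast
        [Parameter(Mandatory)][string]$Major,       # e.g. "VID"
        [string]$Minor,                             # optional, e.g. "SDI"
        [string]$Status                             # optional, e.g. "EXST"
    )
    $parts = @($Discipline, $Major)
    if ($Minor)  { $parts += $Minor }
    if ($Status) { $parts += $Status }
    ($parts -join '-').ToUpper()
}

# New-LayerName -Discipline B -Major VID -Minor SDI -Status EXST   ->   B-VID-SDI-EXST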

Discipline code

The discipline code is a two-character field: the discipline letter followed by either a hyphen or a user-defined modifier character.

Discipline Code Discipline
A architectural
B broadcast
C civil
E electrical
I interior
M mechanical
S structural
T telecom

Major group

The major group field is a four-character field that identifies the system, such as video, audio etc. Although most major groups are logically associated with specific discipline codes, it is possible to combine major group codes with any of the discipline codes. For example, B-VID, T-VID.

Major Description
VID video
AUD audio
ROLL rollcall (S&W)

Minor group

This is an optional four-character field for further differentiation of major groups, for example B-VID-SDI, B-VID-CVBS.

Minor Description
SDI serial digital video
CVBS composite video burst and sync
ABAL analog balanced audio
CBLN cable numbers
CABN cab numbers

Status field

The status field is an optional four-character designator that differentiates new builds from remodelling and changes. For example, B-VID-SDI-EXST.

status description
NEWW new work
EXST existing to remain
DEMO existing to demolish
FUTR future work
TEMP temporary work
MOVE items to be moved
RELO relocated items
NICN not in contract
PHS1–9 phase numbers

Annotation

annotation description
*-ANNO-DIMS dimensions
*-ANNO-KEYN keynotes
*-ANNO-LEGN legends and schedules
*-ANNO-NOTE notes
*-ANNO-NPLT non-plot info e.g. viewports
*-ANNO-REVS revisions
*-ANNO-TEXT text
*-ANNO-TTLB title blocks and sheet borders

Colours

As a general rule for all projects, drawing entities should assume the colour of the layer in which they reside. This means all colours are applied by 'layer', not by 'entity'.

layer description colour line-type
B-VID-SDI SDI video 6-magenta continuous
B-VID-CVBS CVBS video 3-green continuous
B-AUD-ABAL analog balanced audio 5-blue continuous
B-AUD-AUB analog unbalanced audio 5-blue continuous

Colour Codes

code colour
1 red
2 yellow
3 green
4 cyan
5 blue
6 magenta
7 white
8 -
9 lt-grey

THE END

BTW, if anyone has any better solutions I'm still happy to hear them; I also really want to know if anyone has a good cable numbering system!

Virtual Waveform Monitor

For a quick and dirty project to learn Processing, I decided to write a small application that shows how RGB and colour difference levels and vectors are displayed on a PAL video scope.

Using three sliders you can pick your colour and see how it would be represented on a real PAL waveform monitor. It's useful for understanding how the raw RGB colours translate into real-world measurements.
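For reference, the underlying maths is the standard PAL RGB-to-YUV conversion; here's a rough sketch of what the sliders drive (in PowerShell purely for illustration, the app itself is written in Processing):

# Rough PAL RGB -> YUV conversion, with R/G/B normalised 0..1
function ConvertTo-Yuv {
    param([double]$R, [double]$G, [double]$B)
    $Y = 0.299*$R + 0.587*$G + 0.114*$B    # luma (the waveform trace)
    $U = 0.492*($B - $Y)                   # B-Y colour difference
    $V = 0.877*($R - $Y)                   # R-Y colour difference
    [pscustomobject]@{ Y = $Y; U = $U; V = $V }
}

# e.g. 100% red:  ConvertTo-Yuv 1 0 0   ->   Y=0.299, U=-0.147, V=0.615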

Because I'm happy with the learning curve, I haven't added any graticules or measurement options. However, if people find it useful I'll develop it more.

Mac OS X, Windows, Linux

Download it here [download id="4"]

How long will this take? Latency in broadcast control

Latency in control systems for broadcast applications is a key variable in producing smooth television. When you start controlling live systems over large distances, the problem only gets bigger.

So what is acceptable latency? Well, it's hard to say; in live television, when you want to move a camera live on air, the smaller the better. But what is the best we can get using today's technology?

This post is an exercise to find out the shortest possible RTT (round trip time) for controlling a device and seeing the result, completing the feedback loop for the operator.

Here is an example of moving a camera live on air, when the operator is based in London and the studio based in Hong Kong.

I will preface this by saying this is something I have actually had to develop and build. I think a Europe-to-Asia example should represent a worst-case scenario anywhere in the world.

All timing assumptions are based on SD video using H.264; your mileage will vary (a lot) with other video codecs.

Latency - London <> Hong Kong

Purple - The Operator

Here we look at the cognitive psychological process. There is some evidence that it takes around 300ms to recognise an object (http://www.themoog.org/opn), so we should add in some "reaction time".

A quick Google search brings up "The reaction time tester": http://www.themoog.org/stm.

Here you are asked to hit the mouse button when the green light comes on. I scored an average of 0.2942 seconds (about 294ms) over 5 tries.

Online Reaction Time Test

Of course there are many other factors, such as communication overload, skill of the operator, training and so on. But this is meant to be a best-case study, so I'll use 300ms for ideal circumstances.

Blue - Video Encoding

Well, it seems the standard nowadays is H.264 over TCP/IP. The raw encode/decode process spec'd on a Cisco D9093/D9094 IP codec is 580ms for PAL and 550ms for NTSC in IP low mode. Why IP low mode? In practical experience this is the lowest latency setting that causes no problems in the picture; any lower and you start seeing frames drop.

Cisco Encode / Decode times

Green - The decode process

Well, I've sort of lumped the H.264 encoding/decoding latency together in the section above, but we should still add another 40ms to take any video processing (such as video synchronisers or standards converters) into account.

Yellow - TCP/IP network latency

I'm lucky enough to have access to a dedicated network. Using the standard ICMP ping tool available in any OS, I get about 300ms RTT (round trip time) between London and Hong Kong, and 40ms RTT between London and Milan.
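If you want to repeat the measurement from a Windows box, something like this works in Windows PowerShell (the hostname is just a placeholder):

# Average ICMP round-trip time over 10 pings, in milliseconds
$pings = Test-Connection -ComputerName "gateway.example.hk" -Count 10
($pings | Measure-Object -Property ResponseTime -Average).Average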

Over the internet, using http://www.pingtest.net/, I get the following result between my house in the UK and a server in Taiwan (that's the nearest server I could get to Hong Kong):

Pingtest.net - The Global Broadband Quality Test

So I think best case is 300ms.

What makes up the network latency? Well, a number of factors.

Using http://www.wolframalpha.com/ I get the two following results.

london to milan - Wolfram|Alpha
london to hong kong - Wolfram|Alpha

The key piece of information here is the speed of light in fibre. Of course, optical fibres don't go in a straight line, and the light also has to be re-clocked and regenerated, but this is pretty much the best time you're ever going to get (theoretically speaking).
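As a rough back-of-the-envelope check (the ~9,600km great-circle distance and a fibre refractive index of ~1.47 are my own assumptions):

# Theoretical propagation floor for London <-> Hong Kong over fibre
$distanceKm = 9600                    # approx. great-circle distance
$fibreSpeed = 300000 / 1.47           # ~204,000 km/s, light slowed by the glass
$oneWayMs   = $distanceKm / $fibreSpeed * 1000
"One way ~{0:N0}ms, RTT ~{1:N0}ms" -f $oneWayMs, (2 * $oneWayMs)
# -> roughly 47ms one way, ~94ms RTT before any routing, queuing or regeneration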

The rest of the delay is made up by packet inspection routers and router packet queuing.

Network jitter is a very important subject, but it is outside the scope of this post.

Conclusion

So what is the best RTT latency time I can expect?

Well, to visualise this I created the video below, using audio tones to represent the times shown in the flow chart diagram at the top of this post.

 

Using current technology with real-world values, I think expecting anything less than 1740ms is unlikely, especially at the video quality broadcasters expect. Remember too that I have only used SD video as an example; expect HD to increase this time. I also had the luxury of a very good network: if you're using the internet (which has its own QoS issues), you're looking at adding at least another 200ms.

Using a separate low-quality return video feed just for monitoring the feedback may also lower this time, but using two telco circuits will increase the cost.