The EBU (European Broadcasting Union) is urging its members, mostly public service broadcasters, to apply more rigorous tests for vulnerabilities as they migrate to IP infrastructures and deploy more OTT services.
Since most of them have been on this course for several years at least, it might be assumed they were already well versed in cybersecurity. The EBU reckons, however, that many have continued to rely on traditional pay TV revenue protection and have not been sufficiently aware of the broader enterprise cybersecurity threats to which they are increasingly exposed as they embrace IP.
Not surprisingly, content protection firms such as Verimatrix, Kudelski’s Nagra and Irdeto have been aware of these risks and have started expanding their portfolios accordingly. But the EBU would be right to suggest they do not possess all the required tools and expertise themselves, which is why some of them have been collaborating with other parties to plug the gaps.
Meanwhile the EBU itself has published a short paper outlining threats, along with tests that broadcasters and content owners can employ to shore up their defenses and identify vulnerabilities in individual components. The principles emerged in the IT world and are employed during software and project development, but in practice new risks and exposures continue to surface as systems are integrated, so threat monitoring is ongoing and never complete. Such principles should be employed more thoroughly during product design and deployment.
The premise of the EBU paper, entitled Minimum Security Tests for Networked Media Equipment, is that as production workflows and infrastructures migrate rapidly to generic IP-based IT systems, they expose connected devices that still tend to have a lower security threshold inherited from the era of non-connected broadcast media, where revenue protection was important but enterprise cybersecurity, protecting against malware, ransomware and DDoS (Distributed Denial of Service) attacks, was not.
At the same time, threats are becoming easier to execute with the help of readily available tools, and are growing in scale while continuously evolving. Media companies should therefore not just implement the recommendations themselves but insist that their systems integrators, developers and vendors do so as well.
It is true some of the EBU’s recommendations do sound like teaching a chicken to lay eggs, such as encouraging and enforcing good password hygiene among users. However, the EBU might argue that constant nannying is needed to ensure users do not employ default or simple-to-guess passwords, and that service providers need to run regular checks at the network level. Broadcasters themselves can be guilty of failing to protect against brute-force attacks on passwords and associated authentication systems, which are essentially just rapid trial and error. Such tests need regular repetition.
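The network-level check against brute-force trial and error can be as simple as counting consecutive failed logins per account and locking out further attempts. A minimal sketch (the `BruteForceGuard` class and its threshold are illustrative, not from the EBU paper):

```python
from collections import defaultdict

class BruteForceGuard:
    """Illustrative sketch of a network-level brute-force check:
    lock an account after too many consecutive failed logins."""

    def __init__(self, max_failures=5):
        self.max_failures = max_failures
        self.failures = defaultdict(int)  # consecutive failures per user

    def record(self, user: str, success: bool) -> bool:
        """Record a login attempt; return False once the account is locked."""
        if self.failures[user] >= self.max_failures:
            return False  # locked out: rapid trial and error blocked
        if success:
            self.failures[user] = 0  # reset the counter on success
        else:
            self.failures[user] += 1
        return True
```

Real deployments would add time windows, alerting and unlock procedures, but even this shape defeats naive rapid-fire guessing.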
The paper becomes more interesting when it touches on more sophisticated attacks and defenses, in particular the various techniques grouped under the heading of “fuzzing”, whose origins date back to the dawn of software development in the 1950s. Back then, when the only security was physical protection of the hardware, the objective of fuzzing was purely to find bugs in software; since then this range of techniques has extended to finding vulnerabilities, employed both by hackers and pirates and by security researchers.
Fuzzing today is used in both debugging and vulnerability testing. It involves transmitting deliberately incorrect or malformed input to a target component running the relevant software and observing the result. If the target behaves unexpectedly or crashes completely, it has proved unable to cope with the exception and requires further investigation that may unearth a vulnerability. For a hacker that vulnerability can be exploited for malicious purposes, while for security staff or a software developer it highlights what needs to be fixed.
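The loop just described, feed malformed input, watch what happens, can be sketched in a few lines. Here `parse_packet` is a hypothetical, deliberately naive target, not anything from the EBU paper; the interesting outcome is the unhandled exception, which is what a fuzzer flags for investigation:

```python
import random

def parse_packet(data: bytes) -> str:
    """Hypothetical target: a naive parser that trusts its inputs."""
    if len(data) < 2:
        raise ValueError("too short")  # a handled, expected rejection
    length = data[0]
    return data[1:1 + length].decode("ascii")  # may blow up on bad bytes

def fuzz_once(target, data: bytes) -> str:
    """Send one input to the target and classify the outcome."""
    try:
        target(data)
        return "ok"
    except UnicodeDecodeError:
        return "crash"      # unhandled failure mode: worth investigating
    except ValueError:
        return "rejected"   # input was handled gracefully

# Drive the target with random byte strings and tally the outcomes.
random.seed(0)
results = [fuzz_once(parse_packet, bytes(random.randrange(256) for _ in range(8)))
           for _ in range(100)]
print(sum(r == "crash" for r in results), "of 100 random inputs were mishandled")
```

Each "crash" marks an input the parser could not cope with; a developer fixes the parser, while an attacker would probe whether the failure is exploitable.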
Fuzzing is equally applicable to developers and customers, although the latter may have to report back to their software providers to get a fix applied. In fact, fuzzing should be employed throughout the software development lifecycle, during implementation, verification and release phases. As the EBU indicates, it can and should also be applied regularly during subsequent use, when it can identify previously undetected vulnerabilities, so-called zero-days, that may at this stage affect the integrity of the system and expose it to attack.
The EBU identifies two categories of fuzzing relevant to video infrastructures: protocol fuzzing and URL fuzzing. Protocol fuzzing is the larger category, having evolved through three levels of sophistication, starting with basic random fuzzing, where test cases are generated without any intelligence. This is simple and fast to execute but unlikely to be of value to a broadcaster, since it normally fails to penetrate the target and therefore to obtain any result. However, it is employed at early stages of testing immature code to identify basic errors before moving on to more sophisticated tests.
The second level is template fuzzing, sometimes known as block fuzzing or, more commonly, mutational fuzzing, since it involves making slight alterations to the input, changing perhaps one byte, just as a point mutation in biology alters a single DNA base or “letter” of a gene. Unlike in biology, the process iterates across the whole input, changing different elements in turn until the user is satisfied that all the errors, or at any rate enough of them, have been found.
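The point-mutation idea can be sketched directly: start from one known-good input and emit variants that each differ in exactly one byte. This generator (illustrative only; real mutational fuzzers apply many mutation operators, not just a bit-flip) shows the shape of the technique:

```python
def mutate_inputs(seed: bytes):
    """Yield single-byte 'point mutations' of a known-good input:
    each variant differs from the seed in exactly one byte position."""
    for i in range(len(seed)):
        mutated = bytearray(seed)
        mutated[i] ^= 0xFF  # flip every bit of one byte
        yield bytes(mutated)

# A known-good input acts as the template; each variant is fed to the target.
seed = b"GET /index.html"
variants = list(mutate_inputs(seed))
```

Each variant would then be fed to the target in turn, exactly as in the basic fuzzing loop, which is also where the cost becomes apparent: one mutation per byte per operator quickly multiplies into a very large test run.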
This process still has considerable drawbacks: one is time and computational cost; another is that its focus is quite narrow, being derived from a single input known to be correct. It can also be hard to manage the process as the input evolves.
To overcome these drawbacks a more sophisticated approach called generational testing has emerged, also known as model-based fuzzing or simply intelligent fuzzing. This is more structured, based on a data model, typically derived from the protocol’s RFC, that describes complete data objects handled as single entities. The idea is to avoid large numbers of redundant tests that change just parts of an object or individual bytes. It creates malformed input related to the structure of the protocol, producing controlled randomness capable of relatively exhaustive testing in far fewer iterations, without wasting time repeating tests that have effectively been done already with essentially the same input, give or take the odd irrelevant byte.
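The contrast with byte-level mutation can be made concrete. Suppose the protocol model is a toy length-prefixed message, one length byte followed by a payload. Instead of flipping every byte, a generational fuzzer produces a handful of structurally interesting violations of that model (the message format and case names here are invented for illustration):

```python
def generate_cases(payload: bytes) -> dict:
    """Model-based fuzzing sketch for a toy length-prefixed message:
    [1-byte length][payload]. Generate a few structural violations
    of the model rather than exhaustive byte flips."""
    valid = bytes([len(payload)]) + payload
    return {
        "valid":            valid,                              # baseline
        "length_too_large": bytes([len(payload) + 10]) + payload,  # lies about size
        "length_zero":      b"\x00" + payload,                  # claims empty
        "truncated":        valid[:-1],                         # body cut short
        "max_length_empty": b"\xff",                            # 255 promised, 0 sent
    }

cases = generate_cases(b"hello")
```

Five targeted cases probe the length-handling logic that matters, where pure mutation would need hundreds of iterations to stumble onto the same boundary conditions.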
The EBU also advocates URL fuzzing, which involves various tools for discovering resources or information that were not meant to be publicly accessible. These could be backups or index or archive files, which can contain sensitive information protected only by “obscurity”. Since the same tools are available to hackers, it is not a good idea to rely on obscurity for security, as the EBU observes.
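At its core, URL fuzzing is a wordlist walk: append each candidate path to a base address and see what responds. A minimal sketch (the wordlist is a token sample, and the `fetch` callback is injected so the example stays offline; real tools like these ship wordlists of thousands of entries):

```python
from urllib.parse import urljoin

# A tiny sample of commonly exposed paths; real wordlists are far larger.
WORDLIST = ["backup.zip", "index.bak", ".git/config", "archive.tar.gz", "admin/"]

def candidate_urls(base: str, wordlist=WORDLIST) -> list:
    """Build the URLs a URL fuzzer would probe under a base address."""
    return [urljoin(base, word) for word in wordlist]

def probe(urls, fetch) -> list:
    """Return URLs that answer 200 OK. fetch(url) -> HTTP status code is
    injected so the sketch never touches the network."""
    return [u for u in urls if fetch(u) == 200]

urls = candidate_urls("https://example.com/")
```

Anything `probe` returns is a resource relying on obscurity alone, which is precisely what the EBU warns against.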
One point the EBU missed, perhaps deeming it beyond the scope of a short paper, is that many of these vulnerabilities will become more prevalent, and fall within the domain of broadband and pay TV operators if not public service broadcasters, as the consumer Internet of Things (IoT) proliferates. This was explained in The State of Fuzzing 2017, a paper by California-based Synopsys, probably the world’s largest design automation software group with annual revenues of almost $3 billion. The paper noted that, grouped by industry vertical, the protocols demonstrating the least maturity, and therefore the highest number of failures over the shortest period of time and so the greatest risk, were found in Industrial Control Systems (ICS). This sector favors its own niche protocols, which have not been well tested and are now finding increased use in the Industrial Internet of Things. Many of these protocols are also present in consumer IoT and gaining exposure now that IoT software is becoming ubiquitous and must intercommunicate.
This could be a time bomb of vulnerabilities already present, with potentially profound consequences, according to Synopsys. Handheld devices and home sensors could become compromised if these protocols are used but not implemented or tested thoroughly. As IoT expands, it is important that vendors start testing for unknowns if they haven’t already, because many firmware-based personal IoT systems may be hard or impossible to update post-release.
All the more reason for video service providers to heed the EBU’s advice now and get their heads around fuzzing.