NHS bodies should invest in new IT infrastructure, says NHSX official

https://www.governmentcomputing.com/healthcare/news/nhs-infrastructure-nhsx

Government Computing

GC Staff Writer

NHS organisations will have to prioritise investment in replacing old IT infrastructure and networks to realise the ambitions of the NHS Long Term Plan, NHSX associate CIO Rob Parker has said. NHSX is a government unit responsible for setting national policy and developing best practice for NHS technology, digital and data.

Addressing the annual Northern, Yorkshire and Humberside NHS Directors of Informatics Forum’s (NHYDIF) conference in York, Parker also discussed the new digital plans that the integrated care systems (ICSs) are expected to prepare. Parker said that ICSs will have to spend the coming years “paying the collective price on accumulated technical debt” before they can progress to delivering joined-up, digitally enabled patient care.

Affirming that digital investments must be linked to system-wide outcomes, Parker said: “We’ve seen the formation of ICS and system vanguards that have so far been largely in isolation from technology.

“I don’t know how many investment funds we’ve had in particular areas or capabilities, including tech funds, provider digitisation, GDEs, LHCR, e-rostering, e-prescribing and diagnostics, but we now need to move to system-wide enablers.

“We now need to link digital maturity and transformation to overall service transformation.”

The NHSX associate CIO showed slides indicating that the new ICS plans are expected to cover core areas such as infrastructure; improving diagnostic services; digitising providers; resource and activity management; integrating health and care data records and workflows; patient-facing services; and population health analytics and intelligence. ICSs will be expected to classify their capabilities in these areas as basic, developing or advanced, and he said that the centre will score the ICSs on their collective digital maturity.

Furthermore, Parker confirmed that linking technology to the ambitions of the long-term plan will form a key part of the digital plans that the new ICSs are expected to draw up. Parker said: “This creates a massive opportunity over the next 3-5 years for us in the centre and locally to deliver the digital tools that make those key enablers real. The move to ICSs is a chance to massively increase the scale of what we’ve been doing.”
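To make the basic/developing/advanced classification concrete, here is a minimal, purely hypothetical sketch of how per-area ratings might be rolled up into a single digital maturity score. The capability areas are taken from the article; the numeric scale, the averaging and all names are assumptions for illustration, not NHSX's actual scoring methodology.

```python
# Hypothetical sketch: rolling up an ICS's per-area self-assessments
# (basic / developing / advanced) into one digital maturity score.
# The capability areas come from the article; the numeric scale and the
# averaging are assumptions, not NHSX's actual methodology.

LEVEL_SCORES = {"basic": 1, "developing": 2, "advanced": 3}

CAPABILITY_AREAS = [
    "infrastructure",
    "diagnostics",
    "digitising providers",
    "resource and activity management",
    "integrated records and workflows",
    "patient-facing services",
    "population health analytics",
]

def maturity_score(assessment):
    """Average the per-area levels into a score between 1 and 3."""
    scores = [LEVEL_SCORES[assessment[area]] for area in CAPABILITY_AREAS]
    return sum(scores) / len(scores)

example_ics = {
    "infrastructure": "developing",
    "diagnostics": "basic",
    "digitising providers": "developing",
    "resource and activity management": "basic",
    "integrated records and workflows": "developing",
    "patient-facing services": "basic",
    "population health analytics": "advanced",
}

print(f"Collective digital maturity: {maturity_score(example_ics):.2f} / 3")
```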

Public trusts NHS most on data ethics

https://www.ukauthority.com/articles/public-trusts-nhs-most-on-data-ethics/

UK Authority

Mark Say – Managing Editor

The NHS is outstripping central and local government in winning public trust for how it uses personal data, according to findings from a new survey.

The Open Data Institute (ODI), working with pollster YouGov, tested public attitudes on the issue, revealing that the health service came top, while the financial sector outscored most parts of the public sector on perceptions of using data ethically.

Of the 2,000 adults questioned, 59% said they trust the NHS to be ethical in its use of their data, the only sector to score more than 50%. However, this shows a decline since a survey focused on healthcare data in early 2018, which recorded a score of 64%.

In the new exercise, emergency services scored 47%, banks and building societies 42%, local government 31%, central government 30% and universities 18%. Family and friends came in well below the top, at 34%.

Among the other findings, 44% feel that government and regulators should bear most responsibility for making sure personal data is handled ethically.

Necessity factor

Asked to identify the key factor in ethical behaviour, a small majority of 52% pointed to collecting only the personal data necessary to provide a service.

The survey also pointed to a split in how well people feel they understand data protection and the General Data Protection Regulation (GDPR), with 57% saying they understood it fairly or very well, and 41% saying not very well or not at all.

The ODI highlighted the rising importance of data ethics as organisations rely more on data to improve their operations and personalise services. The general feeling is that the roll out of regulations such as the GDPR and media coverage has raised people’s awareness of their data rights and the potential for misuse.

Expectations

Jeni Tennison, the ODI’s chief executive, said: “The survey shows us that people quite rightly expect organisations to use their personal data ethically.

“Organisations need to respond to their concerns and be more trustworthy in how they collect and use personal data. This is not only the right thing to do, it will help organisations to keep benefiting from the data they rely on and retain the trust of their customers and employees.

“Talking about using data ethically is not enough, organisations need to publicly demonstrate how they do this in order to build trust.”


History as a giant data set: how analysing the past could help save the future

https://www.theguardian.com/technology/2019/nov/12/history-as-a-giant-data-set-how-analysing-the-past-could-help-save-the-future

The Guardian

Laura Spinney – Tue 12 Nov 2019 06.00 GMT

In its first issue of 2010, the scientific journal Nature looked forward to a dazzling decade of progress. By 2020, experimental devices connected to the internet would deduce our search queries by directly monitoring our brain signals. Crops would exist that doubled their biomass in three hours. Humanity would be well on the way to ending its dependency on fossil fuels.

A few weeks later, a letter in the same journal cast a shadow over this bright future. It warned that all these advances could be derailed by mounting political instability, which was due to peak in the US and western Europe around 2020. Human societies go through predictable periods of growth, the letter explained, during which the population increases and prosperity rises. Then come equally predictable periods of decline. These “secular cycles” last two or three centuries and culminate in widespread unrest – from worker uprisings to revolution.

In recent decades, the letter went on, a number of worrying social indicators – such as wealth inequality and public debt – had started to climb in western nations, indicating that these societies were approaching a period of upheaval. The letter-writer would go on to predict that the turmoil in the US in 2020 would be less severe than the American civil war, but worse than the violence of the late 1960s and early 70s, when the murder rate spiked, civil rights and anti-Vietnam war protests intensified and domestic terrorists carried out thousands of bombings across the country.

The author of this stark warning was not a historian, but a biologist. For the first few decades of his career, Peter Turchin had used sophisticated maths to show how the interactions of predators and prey produce oscillations in animal populations in the wild. He had published in the journals Nature and Science and become respected in his field, but by the late 1990s he had answered all the ecological questions that interested him. He found himself drawn to history instead: could the rise and fall of human societies also be captured by a handful of variables and some differential equations?
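For readers unfamiliar with the kind of model Turchin built his reputation on, here is a minimal sketch of the textbook predator-prey (Lotka-Volterra) equations, integrated with a plain Euler step. It is illustrative only: the parameters are arbitrary and this is the classic ecological system, not one of Turchin's historical models, but it shows how a handful of coupled differential equations can generate sustained oscillations.

```python
# Minimal sketch of the classic Lotka-Volterra predator-prey equations,
# integrated with a simple Euler step. Parameters are arbitrary; the point
# is only that coupled differential equations produce population cycles.

def lotka_volterra(prey, predators, dt=0.001, steps=60_000,
                   a=1.0, b=0.1, c=1.5, d=0.075):
    """Yield (time, prey, predators) samples as the two populations cycle."""
    for step in range(steps):
        d_prey = a * prey - b * prey * predators       # growth minus predation
        d_pred = d * prey * predators - c * predators  # feeding minus mortality
        prey += d_prey * dt
        predators += d_pred * dt
        if step % 5_000 == 0:
            yield step * dt, prey, predators

for t, x, y in lotka_volterra(prey=10.0, predators=5.0):
    print(f"t={t:5.1f}  prey={x:7.2f}  predators={y:7.2f}")
```

In cliodynamics the coupled quantities are things like population size, elite numbers and state fiscal health rather than prey and predators, but the mathematical machinery is of the same family.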

Turchin set out to determine whether history, like physics, follows certain laws. In 2003, he published a book called Historical Dynamics, in which he discerned secular cycles in France and Russia from their origins to the end of the 18th century. That same year, he founded a new field of academic study, called cliodynamics, which seeks to discover the underlying reasons for these historical patterns, and to model them using mathematics, the way one might model changes to the planet’s climate. Seven years later, he started the field’s first official journal and co-founded a database of historical and archaeological information, which now contains data on more than 450 historical societies. The database can be used to compare societies across large stretches of time and space, as well as to make predictions about coming political instability. In 2017, Turchin founded a working group of historians, semioticians, physicists and others to help anticipate the future of human societies based on historical evidence.

Turchin’s approach to history, which uses software to find patterns in massive amounts of historical data, has only become possible recently, thanks to the growth in cheap computing power and the development of large historical datasets. This “big data” approach is now becoming increasingly popular in historical disciplines. Tim Kohler, an archaeologist at Washington State University, believes we are living through “the glory days” of his field, because scholars can pool their research findings with unprecedented ease and extract real knowledge from them. In the future, Turchin believes, historical theories will be tested against large databases, and the ones that do not fit – many of them long-cherished – will be discarded. Our understanding of the past will converge on something approaching an objective truth.

To some, the prediction that Turchin made in Nature in 2010 now seems remarkably prescient. Barring any last-minute surprises, the search engine that decodes your brainwaves won’t exist by 2020. Nor will crops that double their biomass in three hours, or an energy budget that is mostly supplied by renewables. But an imminent upheaval in the political order of the US or UK seems increasingly plausible. The Fragile States Index, calculated by the US non-profit The Fund for Peace, reveals a worsening trend toward instability in those two countries, in contrast to steady improvement in much of the rest of the world.

“We are in an age of considerable turbulence, matched only by the great age of Atlantic revolutions,” says George Lawson, who studies political conflict at the London School of Economics, referring to the period from the 1770s to the 1870s, when violent uprisings overthrew monarchies from France to the New World.

Turchin sees his prediction for 2020 not just as a test of one controversial theory. It could also be a taste of things to come: a world in which scholars generate the equivalent of extreme weather warnings for the social and political conditions of the future – along with advice on how to survive them.

As the future of mobility becomes smarter, we must treat people’s data as a public good

https://tech.newstatesman.com/guest-opinion/future-of-mobility-data-as-a-public-good

NS Tech

12th November 2019

As Minister for Transport in 1969, Richard Marsh, along with his contemporaries, oversaw the UK’s role in a defining era for global transport. According to a recent government report, this period of change was driven by consumer demand, technological innovation, government policy and the mid-term consequences of war.

Fast forward 50 years, and transport minister George Freeman’s recent speech outlining his plan for the future of mobility showed that similar market forces and drivers of change are still applicable today. Public expectations of the transport experience continue to rise and agile regulation that’s smart and adaptable to real-world changes will need to underpin it all.

With more and more people living in urban environments, the need to manage inner city congestion, reduce carbon emissions and improve air quality is becoming more critical. Innovation in mobility is one of the great challenges of our time. Freeman laid out a vision for a connected mobility ecosystem to address these challenges – a vision that fully involves the public. It centres on two key ideas: that the next phase of mobility will be people-orientated, and that the use of public data at scale will be crucial to informing its development.

We can’t disagree that these ideas are critical to the effective development of the future mobility ecosystem within urban environments. However, we need to do more to ensure that future solutions adhere to principles such as the responsible use of citizen data, privacy and security by design. If we want to catalyse progress, building trust across the system and gaining the public’s buy-in is key. People should have a broader understanding of not only how their data is being used but also why it’s so crucial to a cleaner, connected and more efficient future.

Data for the people

Several of the points raised by Freeman highlight the importance of data to the success of his plan, such as setting up a network of digitally connected testbeds. Data will be an essential asset to improving the efficiency of the mobility system, whether by unlocking information that the government is already sitting on through improved information sharing or enabling new ways to gain insights from the public. For example, he referenced the strategies of the life sciences and agri-tech sectors, which collate data into hubs that allow it to be leveraged more easily at scale.

And since many innovations like autonomous cars and e-scooters require user information, mobility companies have a responsibility to collect it securely and responsibly. This balance between innovation, security and privacy is not only relevant to developers of mobility solutions, but is also increasingly important to major supply chain participants such as telecommunications providers, who have an ever more prominent role to play in securing the future mobility ecosystem.

Transparency means trust

All this means that people are in the driving seat of the future of mobility, so we have a collective responsibility to make sure commuters are brought on the journey.

However, the responsibility doesn’t just fall on the government. Whether it’s public infrastructure bodies such as Transport for London or data security specialists, anyone benefiting from public data must keep people informed about how their data is being used for the good of the population. That’s the only way that essential public trust can be won. And that’s why at Plexal, we support the UK’s most innovative mobility startups and co-locate them with LORCA: a government-backed cybersecurity programme that’s encouraging collaboration between sectors and companies of all sizes to help secure the UK’s digital economy now and in the future.

Earning trust also hinges on creating a secure-by-design mentality for new mobility solutions. This term has traditionally been linked to enterprise technologies within the B2B domain. But the emergence of cyber-physical systems has imbued the public infrastructure around us with technology and, in doing so, has shifted cyber risk out of the business domain and into our everyday physical environment. It’s crucial that private enterprise helps the government apply secure design principles to new mobility systems so that security is embedded at their core, rather than bolted on as an afterthought.

As the way we move around urban centres becomes more reliant on technology for efficiency, we will increasingly become digital citizens whose data is a key currency within the digital economy. Security might become an important pillar of public infrastructure development in this future, but without the population understanding and trusting the value of data-led systems, we’ll never see the smart mobility solutions we so desperately need to help our cities cater to growing populations and evolve in an environmentally friendly way.

Saj Huq is programme director at LORCA, the London Office for Rapid Cybersecurity Advancement

Royal Navy plans to intensify use of Nelson platform

https://www.governmentcomputing.com/technology/news/royal-navy-nelson-platform

Government Computing

GC Staff Writer

National Cyber Deception Laboratory

Image: Defence Cyber School and Cranfield University launch National Cyber Deception Laboratory.

The Defence Cyber School at the Defence Academy in Shrivenham has joined forces with Cranfield University to launch the National Cyber Deception Laboratory (NCDL).

The two organisations are developing the lab as a national focal point for cyber deception, and it is expected to help the UK Ministry of Defence (MOD) improve the defence of its networks in cyberspace.

The National Cyber Deception Laboratory aims to bring together practitioners and researchers across government, industry and academia to enable research and offer guidance in the context of national security.

The lab will look to come up with innovative and novel approaches to help in the development of cyber deception capabilities by linking individuals and organisations in multiple sectors.

MOD Head of C4ISR and Cyber Joint User Air Commodore Tim Neal-Hopes said: “We live in a period of constant contest. A period where the UK is attacked through cyberspace on a daily basis.

“Defence, if it is to maintain operational effectiveness, must therefore defend its information, networks and cyber-dependent capabilities, against these perpetual attacks.”

According to Cranfield University, cyber deception is expected to be one of the most important growth areas in cybersecurity in the coming years.

The university said that the evolution of the field within the UK military will help network defenders adopt a proactive approach, for example by using military deception tradecraft to efficiently guard against and manipulate the activities of attackers operating within their networks.

One approach could involve confusing the enemy into taking steps that expose their identity or disrupt their attacks, said Cranfield University.
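As a toy illustration of one deception building block, the sketch below shows a decoy network service: it presents a fake banner, accepts connections and logs whatever a would-be attacker sends, so that every interaction becomes a defensive signal. It is purely illustrative and is not anything used by the NCDL or the MOD.

```python
# Toy decoy service: accepts connections, returns a fake banner and logs
# whatever the client sends. Illustrative only; the port and banner are
# arbitrary choices meant to look like a misconfigured SSH service.

import socket
import datetime

DECOY_PORT = 2222
FAKE_BANNER = b"SSH-2.0-OpenSSH_7.4\r\n"

def run_decoy(host="0.0.0.0", port=DECOY_PORT):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((host, port))
        server.listen()
        print(f"Decoy listening on {host}:{port}")
        while True:
            conn, addr = server.accept()
            with conn:
                conn.sendall(FAKE_BANNER)
                data = conn.recv(4096)
                # Any interaction with a decoy is suspicious by definition:
                # record it for the defenders instead of serving real work.
                print(f"{datetime.datetime.utcnow().isoformat()} "
                      f"connection from {addr[0]}:{addr[1]} sent {data!r}")

if __name__ == "__main__":
    run_decoy()
```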

National Cyber Deception Laboratory director Darren Lawrence said: “Military networks need a full spectrum military defence – existing civilian security approaches are simply not up to this task.

“Deception is all about creating errors in how our adversaries make sense of their world. It is about getting them to act in ways that suit our purposes, not theirs.

“We are delighted to be working with the Defence Cyber School on this initiative. Researching ways to shape attacker behaviour and deny them the freedom to operate within our networks will enable military cyber defence to move on to a more aggressive footing and deter future attacks.”

40 more exchanges with some G.fast coverage

https://www.thinkbroadband.com/news/8589-40-more-exchanges-with-some-g-fast-coverage

thinkbroadband

The G.fast footprint we are tracking has increased to 1,978,816 premises, up from 1,903,691 on September 25th.

The forty exchanges where we have seen G.fast pods appear and start accepting orders for the first time are listed below. The * denotes exchanges where we have more work to do to check the area for additional live pods.

  • Aberdeen Denburn *
  • Bacup *
  • Bethsada
  • Braunstone *
  • Bridlington *
  • Bucksburn
  • Chichester
  • Coalville
  • Douglas
  • Dunstable *
  • Galashiels
  • Great Alton
  • Great Harwood
  • Greenford *
  • Gowerton *
  • Harpenden
  • Hawarden *
  • Kelsall
  • Kinellan
  • North Shields
  • Norwich St Faiths
  • Old Catterick
  • Oswestry
  • Platt Bridge
  • Romsay *
  • Rothley
  • Runcorn East *
  • Sandy
  • Shaw
  • Sutton Elms
  • Tillicoultry
  • Torquay *
  • Turton
  • Warsop
  • Wilmslow *
  • Woodstock
  • Worcester St Johns
  • Worksop
  • Worthing Swandean *

The total number of live G.fast pods we know about is 8,330, and between them they cover some 3,312,664 premises, with the distance from the pod dropping this to the headline 1,978,816 premises with G.fast available. The forty exchanges above don’t account for all of the increase in the footprint, as some existing exchanges are still seeing more pods appear.

Our broadband map has the vast majority of the G.fast areas on it, and is in the process of being updated to include the latest ones, such as Old Catterick.

One part of the G.fast roll-outs that is causing people to scratch their heads a little is that BT Consumer in particular is limiting people to buying the 160 Mbps version, due to the amber status on the data for estimated line speeds. Only once the service has gone live and real sync speeds have fed back into the system can people upgrade to the fastest 330 Mbps tier. The reasoning is most likely to manage expectations and avoid overselling the speeds that are possible, i.e. better to hit the point-of-sale guarantee than to get it wrong.
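As a rough illustration of that ordering logic, the sketch below encodes the rule described above: an amber speed estimate keeps a new order on the 160 Mbps tier, and the 330 Mbps tier opens up only once live sync data supports it (or the estimate is confident). The function name, status values and the 300 Mbps threshold are assumptions for the example, not BT's actual rules.

```python
# Illustrative sketch of the tier-eligibility rule described in the text.
# Status values and the 300 Mbps threshold are assumptions, not BT's rules.

def available_gfast_tiers(estimate_status, live_sync_mbps=None):
    tiers = [160]                              # conservative tier always on offer
    if live_sync_mbps is not None and live_sync_mbps >= 300:
        tiers.append(330)                      # real-world sync supports the top tier
    elif estimate_status == "green":
        tiers.append(330)                      # confident estimate, no need to hold back
    return tiers

print(available_gfast_tiers("amber"))          # new order on amber data -> [160]
print(available_gfast_tiers("amber", 320.0))   # live line syncing fast  -> [160, 330]
```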

Another new development is that some cabinets are gaining VDSL2 side pods on the green PCP cabinet, and these VDSL2 side pods are almost visually identical to the G.fast pods. So if you do see a side pod appear, it may not be for G.fast but to add extra VDSL2 port capacity. The advantage of attaching to an existing cabinet is that power and fibre are nearby and there is no need for a new plinth hole in the pavement.

Locked Up By Lock-In

https://networkingnerd.net/2019/10/18/locked-up-by-lock-in/

The Networking Nerd

Posted on October 18, 2019

When you start evaluating a solution, you are going to get a laundry list of features and functionality that you are supposed to use as criteria for selection. Some are important, like the ones that give you the feature set you need to get your job done. Others are less important for the majority of use cases. One thing tends to stand out for me though.

Since the dawn of platforms, I believe the first piece of comparison marketing has been “avoids lock-in”. You know you’ve seen it too. For those that may not be completely familiar with the term, “lock-in” describes a platform where all the components need to come from the same manufacturer or group of manufacturers in order to work properly. An example would be if a networking solution required you to purchase routers, switches, access points, and firewalls from a single vendor in order to work properly.

Chain of Fools

Lock-in is the greatest asset a platform company has. The more devices they can sell you, the more money they can get from you at every turn. That’s what they want. So they’re going to do everything they can to keep you in their ecosystem. That includes things like file formats and architectures that require the use of their technology or of partner technologies to operate correctly.

So, the biggest question here is “What’s wrong with that?” Note that I’m not a proponent of lock-in. Rather, I’m against the false appearance of choices. Sure, offering a platform with the promise of “no lock-in” is a great marketing tool. But how likely are you to actually follow through on that promise?

I touched on this a little bit earlier this year at Aruba Atmosphere 2019 when I talked about the promise of OpenConfig allowing hardware from different vendors to all be programmed in a similar way. The promise is grand that you’ll be able to buy an access point from Extreme and run it on an Aruba Controller while the access layer policies are programmed into Cisco switches. It’s the dream of interoperability!
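To make the OpenConfig promise a little more concrete, here is a sketch of a vendor-neutral interface configuration using the openconfig-interfaces model, expressed as a Python dict. In principle the same payload could be pushed to devices from different vendors over gNMI or NETCONF; in practice, model support varies by vendor, and the payload below is illustrative only.

```python
# Sketch of a vendor-neutral interface config in the openconfig-interfaces
# model. Illustrative: interface name and description are made up, and real
# deployments depend on which OpenConfig models each vendor supports.

import json

interface_config = {
    "openconfig-interfaces:interfaces": {
        "interface": [
            {
                "name": "Ethernet1",
                "config": {
                    "name": "Ethernet1",
                    "description": "uplink-to-core",
                    "enabled": True,
                },
            }
        ]
    }
}

# Serialised once; in theory applicable to any OpenConfig-capable device.
print(json.dumps(interface_config, indent=2))
```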

More realistically, though, you’ll find that most people aren’t that concerned about lock-in. The false choice of being open to new systems generally comes down to one single thing: price. The people that I know that complain the most about vendor lock-in almost always follow it up with a complaint about pricing or licensing costs. For example:

The list could go on for three or four more pages. And the odds are good you’ve looked at one of those solutions already or you’re currently dealing with something along those lines. So, ask yourself how much pain vendor lock-in brings you aside from your checkbook?

The most common complaint, aside from price, is that the vendor solution isn’t “best of breed”. Which has always been code for “this particular piece sucks and I really wish I could use something else”. But there’s every possibility that the solution sucks because it has to integrate tightly with the rest of the platform. It’s easy to innovate when you’re the only game in town and trying to get people to buy things from you. But if you’re a piece of a larger puzzle and you’re trying to have eight other teams tell you what your software needs to do in order to work well with the platform, I think you can see where this is going.

How many times have you actually wished you could pull out a piece and sub in another one? Again, aside from just buying the cheapest thing off the shelf? Have you ever really hoped that you could sub in an Aerohive AP630 802.11ax (Wi-Fi 6) AP into your Cisco wireless network because they were first to market? Have you ever really wanted to rip out Cisco ISE from your integrated platform and try to put Aruba ClearPass in its place? Early adopters and frustrated users are some of the biggest opponents of vendor lock-in.

Those Three Words

I’m about to tell you why lock-in isn’t the demon you think it is. And I can do it in three words:

It. Just. Works.

Granted, that’s a huge stretch and we all know it. It really should be “well built software that meets all my functionality goals should just work”. But in reality, the reason why we all like to scream at lock-in when we’re writing checks for it is because the alternative to paying big bucks for making it “all just work” is for us to invest our time and effort into integrating two solutions together. And ask your nearest VAR or vendor partner how much fun that can be?

I used to spend a LOT of my time trying to get pieces and parts to integrate because schools don’t like to spend a lot of money on platforms. In Oklahoma they’re even allowed to get out of license agreements every year. Know what that means? A ton of legacy software that’s already paid for sitting around waiting to run on brand new hardware they just bought. And oh, by the way, can you make all that work together in this new solution we were sold by the Vendor of the Month Club?

And because they got the best deal or the best package, I had to spend my time and effort putting things together. So, in a way, the customer was trading money from the hardware and software to me for my wetware — the brain power needed to make it all work together. So, in a way, I was doing something even worse. I was creating my own lock-in. Not because I was building an integrated solution with one vendor’s pieces. But because I was building a solution that was custom and difficult to troubleshoot, even with proper documentation.

More Data Calls for More Bandwidth, Plus a Scalable, Redundant Network

https://www.thefastmode.com/

Josephine Bernson

Our need for more and more bandwidth will be the focus in 2020, and there are no indications that this connectivity trend will subside over the next several years. After all, we’ve come to expect that data will be available at our fingertips, across all industries as well as throughout our everyday lives.

The IoT and potentially billions of connected smart devices are constantly sending in data. The rollout of 5G networking is just beginning. Industries are starting to implement solutions that leverage artificial intelligence and machine learning. The need for businesses to move gigabytes to terabytes or even petabytes of data at a fast rate will continue. There is no end in sight. The following connectivity needs and trends all tie back to the reality of bigger data and the corresponding demand for bandwidth.

#1: More Bandwidth Requires Low Latency, Scalability and Redundancy

The fundamental requirement to move higher and higher amounts of data through fast, reliable networking touches every public and private sector – the government, financial services, healthcare, education, nonprofits, private enterprises and more. More efficient routing and transport of data, voice and video and lower network latency will be important for quicker uploading and downloading of these large data files. To keep pace with increasing demand, network providers must be ready to scale up quickly. Network reliability and redundancy cannot be stressed enough.

#2: The Cloud and SD-WAN

The Cloud and SD-WAN are redefining bandwidth requirements. Today, most businesses, institutions and government agencies are considering migrating data to the cloud, whether it’s an on-premises cloud environment, a hybrid model or a public cloud. They’re either thinking about it, talking about it, in the process of transitioning or have already made the move. SD-WANs, or software-defined wide-area networks, will simplify connecting enterprise networks spread over large geographic distances and facilitate the availability of the vast trove of enterprise data.

#3: Technology Connects the Government, Healthcare and Education

Connecting government offices over one network and enabling them with the fast and reliable bandwidth they need to ensure communication and sharing of information is a must for first responders, law enforcement, hospitals, schools, universities, courthouses and other government offices.

Healthcare providers will expand their use of telehealth. Video consultations provide the ability to reach patients in remote areas without the need for patients to travel. For example, remote monitoring is a way for patients to reduce the number of clinic visits to have a pacemaker checked. Patient medical information can be shared between hospitals and clinics for a more rapid diagnosis. Telemedicine will require more and more bandwidth and equipment as processes become more sophisticated.

Online learning is a reality that has transformed education. Students can learn in their own timeframe, from anywhere, by utilizing the Internet. Classes are largely video, with students uploading assignments for grades through cloud-based education programs. Online learning requires large amounts of data and high levels of network reliability and redundancy. After all, when assignments are due, the sites can’t go down!

#4: The Economic Boost: Urban To Rural and Rural To Urban

The ability to have high-speed Internet and other communications services in smaller communities is an economic boon. Potentially lower property costs encourage larger companies to move to a rural area, which in turn can help revitalize a community’s economic development prospects. Jobs can be added, housing grows, and in turn, there is opportunity for more businesses to follow suit. Thanks to bandwidth technology, talented entrepreneurs can live in small communities and build a thriving online business. Bandwidth is a way to keep smaller communities alive.

#5: Bandwidth Hogs: Video Streaming Services

Another high bandwidth trend is the ongoing expansion of video as an industry. Video is key for home entertainment, business communication, education, and more. YouTube is widely viewed as the second most popular social media platform and TikTok is another social media video app that’s gaining in popularity. Video requires a large amount of bandwidth and as video continues to grow in quality, popularity and varied uses, the need for higher amounts of bandwidth to transport becomes a reality.

#6: Big Data

Higher bandwidth enables so-called “Big Data” analytics, where massive volumes of both structured and unstructured data can be analyzed to extract information and uncover patterns, trends and associations. From discovering customer behavior and buying patterns to advancing cancer research and tailoring treatment strategies to a patient’s unique genetic makeup, big data analytics tools can extract actionable intelligence that was previously unattainable.

These noteworthy technology trends link to the higher and higher volume of data being consumed, and that will continue to grow every day. What does the future hold? An ongoing, ever-increasing need for bandwidth capable of handling all of this data.

UK security chiefs reportedly warn PM against banning Huawei

https://telecoms.com/500829/uk-security-chiefs-reportedly-warn-pm-against-banning-huawei/

Telecoms.com

The UK is much less likely to hit ambitious broadband targets if Huawei isn’t involved in the 5G network, Johnson has apparently been told.

The report comes courtesy of the Mail, which reckons the National Security Council has been warned that Prime Minister Johnson can forget hitting his 2025 fast broadband targets if the UK’s 5G network isn’t up to scratch, and that this would be inevitable if Huawei weren’t allowed to participate; the NSC has apparently conveyed as much to the PM.

When BoJo first announced his ambitious broadband targets they seemed to be focused entirely on fibre, but it looks like he has subsequently been advised that there’s this thing called fixed wireless access that is really handy for connecting remote locations. His patient telecoms advisors will presumably have then pointed out that the kind of FWA needed to help him hit his targets would require a decent 5G network, hence the Huawei angle.

This leak to the Mail, presumably from somewhere in the government, seems designed to serve two purposes. The minor purpose is to rebrand BoJo’s pledge as being focused on outcomes rather than technology types. But the main reason for it is probably to provide cover for the decision not to block Huawei from the 5G network, which seems to have been made already but won’t be announced until after the general election.

US President Trump is a Johnson supporter, but that relationship will be strained if BoJo decides not to toe the US line on Huawei. Maybe BoJo is hoping to use his own broadband pledge as mitigation during the inevitable awkward conversation he will have with Trump after announcing he’s not blocking Huawei from the UK 5G network.

The Benefits of Refreshing Router-Centric WANs with SD-WAN

https://www.networkworld.com/article/3452872/the-benefits-of-refreshing-router-centric-wans-with-sd-wan.html

Network World

Nirav Shah | Nov 11, 2019 6:37 am PST

Weaving automated, broad and powerful security into a seamless security fabric is the foundation for securing digital business.


The advantages of SaaS applications and other cloud services have businesses rethinking their traditional router-centric WAN strategy. That’s because many of today’s business-critical applications carry the twin challenges of needing high performance, especially for latency-sensitive applications such as unified communications, combined with high volumes of data. These requirements can quickly swamp traditional WAN connections that backhaul data and transactions through the data center. Without the ability to connect directly to the internet, application speeds slow and performance suffers.

The other challenge is that routers generally only view data at the packet level, with little to no intelligent recognition or prioritization of business applications. As a result, mission-critical SaaS applications must not only compete for bandwidth with other business data, but also with non-essential traffic such as YouTube videos or Spotify streams. Without the ability to recognize, prioritize, and steer connections to business-critical SaaS applications, it’s all just data going in and out of the branch routers. The result is degraded application functionality, user experience, and business outcomes.
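As a toy contrast with packet-level routing, the sketch below classifies traffic by application and steers it accordingly. The application names and path labels are invented for the example; real SD-WAN appliances do this with deep packet inspection and far richer policy engines.

```python
# Toy illustration of application-aware steering versus packet-level routing:
# business-critical SaaS gets the low-latency direct path, recreational or
# bulk traffic takes the cheap link, unknown traffic is backhauled for
# inspection. All names are invented for the example.

BUSINESS_CRITICAL = {"office365", "salesforce", "voip"}
BULK_OR_RECREATIONAL = {"youtube", "spotify", "os-updates"}

def choose_path(app_name):
    if app_name in BUSINESS_CRITICAL:
        return "direct-internet-low-latency"
    if app_name in BULK_OR_RECREATIONAL:
        return "broadband-best-effort"
    return "mpls-to-datacenter"          # unknown traffic: backhaul and inspect centrally

for app in ["voip", "youtube", "legacy-erp"]:
    print(f"{app:10s} -> {choose_path(app)}")
```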

Transitioning from WAN Routers to SD-WAN

According to a report by IHS Markit for Q2 2019, SD-WAN revenues were up 23% over Q1 as corporations continue to accelerate the replacement of their installed WAN routers with SD-WAN appliances. This growing trend is why a recent IDC report predicts that the SD-WAN market is likely to reach $4.5 billion by 2022, growing at an astounding rate of 40% year over year – an unusually high growth rate even for the high tech industry.

These changes are being driven by the need to make data and other online resources available in real time to even the most remote workers, at the same time as the aging installed base of WAN routers comes due for a refresh. As enterprises enter this router replacement phase, many are taking the opportunity to rethink their WAN strategy by upgrading to SD-WAN-compatible hardware.

However, some organizations still struggle to overcome the preconception that routers and dedicated MPLS connections are the only option for reliable WAN connectivity. That is simply no longer the case. SD-WAN offers a faster route to an efficient, global enterprise network. It not only allows direct and secure connections to cloud-based applications and services over the public internet, as well as direct interconnectivity between branch offices (both without backhauling all traffic through a central hub), but it can still support secure MPLS connections back to the core data center when necessary.

This sort of connection flexibility is crucial. As organizations migrate to increasingly complex hybrid cloud data architectures, they quickly discover that new cloud-based SaaS applications used at the branch that have to travel over traditional, router-centric technologies end up with serious performance and functionality challenges, which can significantly reduce efficiency, productivity, and user experience. They need access to cloud services through direct connections over public networks. And at the same time, many still prefer fast, reliable, and secure MPLS connections between the branch and data center. SD-WAN ensures that connectivity is not an either/or proposition.

Choosing the Right SD-WAN Solution

Of course, like any burgeoning opportunity, new vendors have been flocking to the SD-WAN market, flooding it with a wide range of competing solutions. This is often the first hurdle organizations face when considering a transition strategy. Wading through this complexity begins by first fully understanding your branch connectivity needs and then mapping solutions against those requirements.

One of the most overlooked issues for organizations adopting SD-WAN technologies is the need to provide the same level of security for their direct internet connection that was available when such connections were routed through the central data center. Most SD-WAN solutions fall short in this area, providing at most a simple firewall to protect connections over public networks. To make up the difference, organizations are forced to design, deploy, and manage an overlay security solution that adds layers of management complexity and overhead that often undermines the savings that SD-WAN was supposed to provide.

In addition, when security and connectivity are not fully integrated, organizations can experience serious lapses in visibility and control because security has to continually react to connectivity changes, creating lag times that leave gaps in protection and impact performance.

Instead, any viable SD-WAN candidate should natively provide a consistent security posture through the availability of a full stack of integrated security functions, including NGFW, IPS, anti-virus and anti-malware, web filtering, encryption, and sandboxing. CASB services should also be implemented to protect SaaS applications and prevent Shadow IT-related challenges. And because SD-WAN traffic traveling over public networks needs to be encrypted, those security tools also need to be able to decrypt, inspect, and re-encrypt data at business application speeds. Unfortunately, NGFW solutions – and not just those included with most SD-WAN appliances – tend to struggle to keep up with encrypted data inspection requirements.
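One way to keep that requirement honest during vendor selection is to treat the paragraph above as a checklist. The sketch below encodes the listed security functions and flags the gaps a candidate would leave for an overlay product to fill; the candidate feature set shown is invented for illustration.

```python
# Evaluation checklist built from the security functions named in the text.
# The candidate's claimed feature set is a hypothetical example.

REQUIRED_SECURITY_FUNCTIONS = {
    "ngfw", "ips", "anti-virus", "anti-malware", "web-filtering",
    "encryption", "sandboxing", "casb", "ssl-inspection-at-speed",
}

def security_gaps(candidate_features):
    """Return the required functions the candidate does not provide natively."""
    return REQUIRED_SECURITY_FUNCTIONS - set(candidate_features)

candidate = {"ngfw", "encryption", "web-filtering"}   # hypothetical datasheet claims
print("Would need an overlay for:", sorted(security_gaps(candidate)))
```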

Other essential SD-WAN functions include providing the same traffic and connectivity management services provided by traditional routers. However, SD-WAN solutions also need to be able to prioritize business-critical applications using advanced application recognition and steering. This ensures that applications receive the proper bandwidth and priority from the first packet, as much as tripling application performance.

And because few branches have their own IT staff, zero-touch deployment is also crucial. It enables faster branch roll outs by reducing deployment from days or weeks to hours or minutes. A unified console that can manage both network and security operations can go a long way towards controlling IT overhead. Organizations should also consider SD-WAN tools that can be easily expanded to secure local branch LANs as well, solving two issues with a single solution.

Realizing the Benefits of SD-WAN

At the end of the day, when an organization invests in digital infrastructure they are not looking for theoretical gains. They are looking for a clear return on investment (ROI), increased efficiencies, and greater productivity. It can therefore come as quite a shock when the technology they have invested in is not able to perform to expected specifications.

But through careful planning, leveraging third-party resources (like the NSS Labs SD-WAN Test Report or the Gartner Magic Quadrant for WAN Edge Infrastructure) and clearly defining your organization’s unique requirements, you can make the transition from static MPLS and router-centric WAN connections to the flexible and scalable benefits of a secure SD-WAN solution strategy.