Aruba Helps you Protect your Business from Edge to Cloud


Enhance your organisational security with ClearPass and EdgeConnect.

The Australian Cyber Security Centre (ACSC) received over 67,500 cybercrime reports last financial year, a 13 per cent increase on the previous year. Cybercrime is increasingly prevalent, and attacks are growing more sophisticated.

The pandemic rendered more Australians dependent on the internet for remote employment, access to services and information, communication, and connection with others. This dependency has expanded the attack surface and presented cybercriminals with more target opportunities. However, many firms have neglected to establish secure network access control (NAC).

The ACSC recommends businesses examine their underlying networks to identify vulnerabilities and apply appropriate cybersecurity measures. With cyber-actors scrutinising security risk reports and using automated tools to check for and exploit network opportunities, network security should remain a high organisational priority.

A range of technology solutions is available to businesses seeking to protect their networks and high-value data. Aruba ClearPass and EdgeConnect provide proven security that combats evolving risks to business networks.


Aruba ClearPass

According to Aruba’s Chief Technology Officer, ClearPass helps businesses stay focused on security by adapting to an ever-changing, diverse population of users and devices, defending against attacks that are both intelligent and persistent.

ClearPass is a secure Network Access Control (NAC) solution that combines a comprehensive policy manager with secure BYOD device onboarding, posture testing before permitting secure network connections, and a streamlined guest self-service interface.

ClearPass is built on the Zero Trust principle: no device, person, or network segment is inherently trustworthy, and each should be treated as a potential risk. ClearPass combines discovery and profiling to ensure that users and devices are granted appropriate access privileges, regardless of access method or device ownership.

ClearPass provides businesses with assistance in adopting a Zero Trust posture so they can avoid the dangers associated with the notable increase in attacks directed at Internet of Things devices and network users.

According to Aruba, this method can also be of use to businesses who are looking to implement a secure access service edge (SASE) strategy in order to secure their operations.

ClearPass Device Insight enhances critical discovery and profiling capabilities, enabling the identification of a wide range of IoT and mobile devices in varied circumstances. To accomplish this, ClearPass employs Deep Packet Inspection (DPI), robust machine learning algorithms, and crowdsourced device fingerprinting.
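To give a feel for what device fingerprinting means in practice, here is a deliberately simplified, rule-based sketch in Python. The attribute names and rules are invented for illustration; ClearPass’s actual DPI and machine-learning pipeline is proprietary and far more sophisticated:

```python
# Toy device fingerprinter: classify a device from a few observed
# network attributes. Illustrative only, not a ClearPass algorithm.

def fingerprint_device(attrs: dict) -> str:
    """Return a device classification from observed attributes."""
    ua = attrs.get("user_agent", "").lower()
    vendor = attrs.get("dhcp_vendor_class", "").lower()
    if "iphone" in ua or "ipad" in ua:
        return "Apple mobile device"
    if vendor.startswith("msft"):
        return "Windows host"
    if attrs.get("mac_oui") == "B8:27:EB":  # Raspberry Pi Foundation OUI
        return "Raspberry Pi (possible IoT)"
    return "Unknown - quarantine pending profiling"
```

A real profiler would weigh many such signals (and learned models) together; the point is that unknown devices default to a restrictive outcome rather than open access.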

Aruba ClearPass can be incorporated into any network to protect guest WiFi, corporate WiFi, VPN connections, wired network ports for corporate-issued PCs, laptops, tablets, and BYOD and IoT devices.

ClearPass further reduces security complexity while expanding enforcement options by integrating with an ecosystem of more than 150 third-party solutions spanning a wide range of security capabilities.


Aruba EdgeConnect

Many enterprise applications are migrating from corporate data centres to the cloud. Private-line connections such as multiprotocol label switching (MPLS) are often restrictive and expensive.

Aruba EdgeConnect, a software-defined wide-area network (SD-WAN) platform, enables organisations to substantially cut the cost and complexity of constructing a WAN by leveraging the Internet to link users to applications.

By allowing businesses to use internet connections to augment or replace their existing MPLS networks, Aruba enhances customer responsiveness, increases application performance, and significantly reduces capital and operational expenses.

SD-WAN offers businesses many benefits. By implementing SD-WAN in their networks, businesses can achieve greater agility while also reducing costs.

When it comes to building a good SASE architecture, SD-WAN is the foundational component. By choosing the most powerful security features available and integrating them with SD-WAN, organisations can ensure the greatest level of security for their workers and other stakeholders accessing the network.

SD-WAN enables enterprises to benefit from the versatility of cloud computing while also improving the effectiveness of cloud applications. This is achieved by using local internet breakout to route traffic straight to the cloud.
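The steering decision behind local internet breakout can be sketched very simply. The domain list and policy names below are invented for illustration and are not Aruba’s implementation:

```python
# Simplified traffic-steering decision for local internet breakout.
# Domains and policy labels are hypothetical examples.

TRUSTED_SAAS = {"office365.com", "salesforce.com", "dropbox.com"}

def next_hop(destination_domain: str, is_internal_app: bool) -> str:
    """Decide whether a flow breaks out locally or is backhauled."""
    if is_internal_app:
        return "backhaul-to-dc"           # internal apps stay on the SD-WAN fabric
    if destination_domain in TRUSTED_SAAS:
        return "local-internet-breakout"  # trusted SaaS goes straight to the cloud
    return "secure-web-gateway"           # unknown traffic is inspected first
```

The design point is that trusted SaaS traffic avoids the latency of a trip back to the data centre, while everything unrecognised is still inspected.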

SD-WAN enables enterprises to shift to a thin-branch model by simplifying network architecture, reducing the amount of equipment required at branch sites, and significantly reducing administrative labor for WAN administration.

New branch offices can be established quickly and easily using SD-WAN, and changes to a company’s security policy can be automatically transmitted to hundreds or thousands of branches in minutes, reducing potential for error. Dashboards and a single pane of glass enable network administrators to monitor network status.

Aruba ClearPass integration with EdgeConnect Enterprise provides fine-grained segmentation by enhancing application intelligence with the user and device identification and role-based policy. The additional identity-based context accelerates troubleshooting and issue resolution and allows for consistent security policy enforcement throughout the whole network, from the edge to the cloud.

ClearPass becomes even more powerful when combined with EdgeConnect. EdgeConnect Business Intent Overlays (BIO) enable companies to create a virtual WAN for every traffic class. Application performance, security, and routing policies are automatically programmed to all sites, ensuring consistency across the network.
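Conceptually, a Business Intent Overlay maps each traffic class to its own virtual WAN definition, which is then programmed identically to every site. The sketch below illustrates that idea only; the field names and values are invented, not EdgeConnect’s actual configuration schema:

```python
# Conceptual model of Business Intent Overlays: one virtual WAN per
# traffic class, pushed uniformly to all sites. Illustrative only.

overlays = {
    "real-time-voice": {
        "transport": ["mpls", "internet"],  # prefer MPLS, fail over to internet
        "sla": {"max_latency_ms": 150, "max_loss_pct": 1.0},
        "security": "ipsec",
    },
    "default-data": {
        "transport": ["internet"],
        "sla": {"max_latency_ms": 400, "max_loss_pct": 3.0},
        "security": "ipsec",
    },
}

def program_sites(sites, overlays):
    """Push the same overlay definitions to every site for consistency."""
    return {site: overlays for site in sites}

config = program_sites(["sydney", "melbourne", "perth"], overlays)
```

Because every site receives the same overlay definitions, policy drift between branches is avoided by construction.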

Protect your business from edge to cloud

Aruba ensures interoperability between their infrastructure and other partner solutions, resulting in simplified installation and operation. Aruba’s partnership-certified solutions aid organisations in adopting digital transformation and increase the value of installed infrastructure.

For example, Dropbox, which manages file transfers for up to 700 million customers at any given time, employs Aruba’s edge to cloud security solutions. ClearPass, a key component of the Aruba solution, serves as Dropbox’s principal certificate server. According to Dropbox, employing ClearPass offers a safe and easy experience that allows their large team to easily connect to and access networks that they have been granted access to.

Another example is the Australian National Audit Office (ANAO), which has minimised the possibility of data breaches by adopting ClearPass and AirWave to issue certificates to the devices of 350 members of its highly mobile staff. As a result, its clients trust the ANAO and are prepared to provide auditors with the information required for report creation.

Aruba & Palo Alto Networks

Incorporating the necessary levels of security and control into a mobile-first design is a challenge to be solved as the trend toward more flexible and productive networks gains momentum. As businesses see an influx of headless and IoT devices that connect to the general IT infrastructure, new attack vectors are introduced.

Aruba ClearPass is a proven network access control and policy management solution that acts as a network gatekeeper, ensuring secure network access and quick attack response. The Palo Alto Networks Security Operating Platform prevents intrusions with intelligent automation. Aruba and Palo Alto Networks collaborate to provide powerful integrated features: Aruba ClearPass Secure NAC employs Next-Generation Firewall policies and rules to detect minor to significant changes in user or device behaviour that are indicative of insider attacks.

This integrated solution provides businesses with enhanced visibility into IoT and corporate network devices, establishing firewall rules and restricting application access based on user identity and device security posture.

Through joint efforts, Aruba ClearPass and Palo Alto Networks encourage companies to adopt a Zero Trust strategy. The Zero Trust concept was developed in response to the security challenges inherent in modern apps and networks. Zero Trust solutions apply the following essential ideas:

  • The least privileged level of access. Regardless of the network configurations, such as IP addresses and port numbers, it enables precise access control at the application and sub-application levels.
  • Constant trust validation. Following the authorisation of application access, a continuing trust evaluation is carried out based on changes in the posture of the device, the behaviour of the user, and the activity of the application.
  • Continuous security checks. Extensive and continuous analysis of all application traffic, including approved connections, is used to help prevent attacks, particularly zero-day threats.
  • Data security. Using a single data loss prevention (DLP) policy, this feature provides consistent data management across all applications, including both private and SaaS applications.
  • Application security. Maintains the highest level of security for all corporate systems, including newly developed cloud-native, legacy private, and SaaS applications.
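To make the first two ideas concrete, here is a toy sketch of least-privilege access combined with continuous trust re-evaluation. The scoring thresholds and signal names are invented for the example and are not part of any Aruba or Palo Alto Networks product:

```python
# Toy Zero Trust sketch: a trust score is recomputed from live signals,
# and the role-based privilege ceiling can only be narrowed, never widened.

def trust_score(device_posture_ok: bool, anomalous_behaviour: bool) -> int:
    """Recompute session trust from current posture and behaviour signals."""
    score = 100
    if not device_posture_ok:
        score -= 50
    if anomalous_behaviour:
        score -= 40
    return score

def allowed_actions(role: str, score: int) -> set:
    """Least privilege: role defines a ceiling; falling trust narrows it."""
    ceiling = {"finance": {"read", "write"}, "guest": {"read"}}.get(role, set())
    if score < 60:
        return set()               # trust lost mid-session: revoke access
    if score < 90:
        return ceiling & {"read"}  # degrade to read-only
    return ceiling
```

The key property is that authorisation is never a one-time event: the same user can lose write access, or all access, mid-session as the signals change.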


Aruba & Megaport

Workloads are progressively distributed over public, private, and hybrid clouds, with more end-users working remotely and accessing mission-critical applications and resources via personal computers, laptops, and mobile devices. Businesses can upgrade their networks and achieve new levels of agility and efficiency in connectivity with software-defined networking (SDN) and SD-WAN.

Aruba Networks and Megaport work together to simplify and accelerate network connectivity whilst extending the benefits of SD-WAN and SASE. Megaport Virtual Edge (MVE) easily connects with Aruba SD-WAN to provide businesses with a thoroughly modernised and transformed network architecture designed for the cloud. The joint solution simplifies SASE and enables the rapid delivery of responsive, flexible, and secure networking services in a matter of clicks.

MVE keeps applications and traffic on the Aruba SD-WAN fabric for longer, reducing the risk and unpredictability of traversing the public Internet while delivering the immediate benefits of higher performance and stronger security.


True security can be achieved only if the network is viewed and managed centrally, ensuring that only authenticated and authorised devices can connect. ClearPass enables the development of fine-grained access controls based on factors such as user and device type, device management data, certificate status, location, and day of the week.
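A context-aware rule of this kind can be sketched as follows. The attribute names, VLAN labels, and logic are purely illustrative, not ClearPass configuration:

```python
# Illustrative context-aware access rule combining certificate status,
# device management state, device type, and day of the week.

from datetime import datetime

def grant_access(device_managed: bool, cert_valid: bool,
                 device_type: str, when: datetime) -> str:
    """Return the network segment a connecting device is placed on."""
    is_weekday = when.weekday() < 5       # Monday-Friday
    if not cert_valid:
        return "deny"
    if device_type == "byod" and not is_weekday:
        return "guest-vlan"               # example rule: BYOD limited on weekends
    if device_managed:
        return "corporate-vlan"
    return "quarantine-vlan"
```

In a real deployment each branch would correspond to an enforcement profile, with the default always being the most restrictive option.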

With the possibilities EdgeConnect and ClearPass offer to modernise and protect your organisation, your IT partner can help you develop value-added solutions such as SD-WAN-based secure networking. Contact us today to discuss how we can help you move towards a more intelligent and progressive business, together.

Workload Churn and Balancing IT Environment Choices


Workload churn activity is an indication of the evolution of IT operations. While the public cloud delivers significant benefits and flexibility, the future of IT is hybrid.

The public cloud delivers significant benefits and flexibility, while on-premise environments, in many cases, deliver superior computing performance, data storage, data movement, and disaster recovery plans, as well as security and regulatory compliance requirements.

One of HPE’s leading workload hosting solutions is GreenLake, which combines the best of public and private cloud and delivers an elastic as-a-service platform that can run on-premise, at the edge, or in a co-location facility. HPE GreenLake integrates the simplicity and agility of the cloud with the governance, compliance, and visibility that comes with hybrid IT.

What is workload churn, and how common is it? 

Workload churn (also referred to as workload repatriation) is the process of migrating workloads from the public cloud to on-premise or dedicated off-premise environments. Workload churn is not a new concept; it began when the public cloud became a viable alternative to traditional datacentre operations. The value proposition of the public cloud was its ability to offload datacentre and IT infrastructure management to a service provider. Rather than requiring substantial investment in IT infrastructure, the public cloud aligned operating expenses with service usage.

However, organisations quickly realised the limitations of the public cloud and that it couldn’t fully replace corporate IT operations. A “backward” migration occurred, involving workloads that either were moved into or started in the public cloud being moved into dedicated environments. IDC research shows that such repatriation activity happens almost as often as workload migration into the public cloud. In IDC’s February 2021 Servers and Storage Workload Survey, about three-quarters of respondents who run workloads in the public cloud indicated that they plan to move some of their workloads, partially or fully, into the dedicated cloud or non-cloud environments.

Does workload churn signal limited use of public cloud by enterprise IT? 

Workload churn activity is an indication of the evolution of IT operations. While the public cloud delivers significant benefits and flexibility, the future of IT is hybrid. The use of both dedicated and shared infrastructure by a single organisation and the movement of data and applications between various clouds will be common occurrences. Operations will be defined by each organisation and will depend on several factors, including the need for compute performance, data storage, data movement, and disaster recovery plans, as well as regulatory compliance.

The interoperability of dedicated and public clouds will be a major factor as well. Isolated islands of IT systems each dedicated to a particular workload often create inefficiencies in IT operations. Isolated clouds are also inefficient by modern IT standards. Finding the right balance between dedicated (private) and public cloud usage is more a continuous process for each organisation than a final state.

According to the Cloud Pulse Survey, there is increasing interest among businesses to expand the usage of dedicated cloud solutions. This trend is partially driven by the movement from non-cloud operations to cloud-based operations. The increased use of dedicated clouds is related, at least in part, to workload churn activity.

What are the major reasons for workload churn activities? 

Data security remains one of the biggest contributors to enterprises moving workloads off the public cloud into protected dedicated environments. Despite significant investments into improved security by cloud service providers, public cloud environments remain an attractive target for cybercriminals. In IDC’s 2021 Servers and Storage Workload Survey, 43% of respondents identified data security concerns as the major reason to engage in workload repatriation activity.

Another common and related concern is data privacy. While closely related to data security, data privacy has its nuances related to the exposure of private data to entities that shouldn’t have access to it. In the previously mentioned IDC workloads survey, 36% of respondents identified data privacy as the second most prevailing reason for moving workloads from public cloud to dedicated environments. Performance and bandwidth bottlenecks, the unpredictability of public cloud service pricing, and IT consolidation efforts are among the top reasons for workload churn. These reasons for moving workloads were cited by 15 – 25% of survey respondents.

Are there any differences in triggers for workload churn? 

While data security and privacy are commonly shared concerns across all workloads, workload profiles play a role in triggering repatriation activities. For example, collaborative, ERM, and CRM applications have a higher rate of private information shared between users, so data security concerns are strong factors when deciding to move at least part of the data into protected dedicated IT environments.

For other workloads, such as data management, the need for better performance plays a bigger role in determining workload repatriation. Bandwidth concerns impact a broad range of workloads that need a continuous movement of data, such as networking, security, technical, and data analytics workloads. The unpredictability of usage-based pricing often causes organisations to move VDI, application development, and collaboration workloads into dedicated environments with more predictable pricing.

What can enterprises do to evaluate optimal IT environments for the placement of workloads? 

As IDC’s Cloud Pulse Survey shows, enterprises are departing from the “public cloud only” paradigm and moving toward “public cloud first” and “public cloud also” approaches to planning their IT operations. This shift embraces the reality that hybrid cloud and multi-cloud approaches are more likely to serve organisational IT needs.

What became evident in the past few years is a move toward services-oriented IT. A variety of recent solutions give enterprise users the ability to achieve the cloud experience on dedicated infrastructure. The availability of such solutions helps with solving the dilemma of workload placement to a certain point.

Enterprises need to thoroughly evaluate the current and future needs of their workloads. What will serve their performance, storage, and data movement requirements better — on-premise solutions or public cloud services? Do the workloads require optimised infrastructure? What would be the migration path should the organisation decide to move to a public cloud? What costs are associated with different options? These are some of the questions important to answer to define the best way to serve the needs of workloads.

HPE GreenLake 

By deploying HPE GreenLake for their on-premise environments, businesses can continue to have a cloud experience, including detailed reporting and resource management through a central console, and the elasticity to grow without needing to immediately buy (and wait for) more hardware.

Today, thousands of businesses are using HPE GreenLake across 50 countries in all industry sectors and sizes, including Fortune 500 companies, government and public sector organisations, and emerging enterprises.

HPE GreenLake offers a range of cloud services that accelerate innovation, including cloud services for computing, container management, data protection, HPC, machine learning operations, networking, SAP HANA, storage, VDI, bare metal, and VMs.

In March 2022, HPE made significant advancements to GreenLake by introducing a unified operating experience, new cloud services, and the availability of solutions in the online marketplaces of several leading distributors.

Multi-cloud experiences 

HPE GreenLake now supports multi-cloud experiences everywhere – including clouds that live on-premise, at the edge, in a co-location facility, and in the public cloud. HPE GreenLake continues to evolve and provide businesses with one easy-to-use platform to transform and modernise their organisation. The HPE GreenLake platform now provides the foundation for more than 50 cloud services, including electronic health records, ML Ops, payments, unified analytics, and SAP HANA, as well as a wide array of cloud services from partners.

Recent platform updates include a convergence with Aruba Central, a cloud-native, AI-powered network management solution. GreenLake has also added a unified operational experience that provides a simplified view and access to all cloud services, spanning the entire HPE portfolio, with single sign-on access, security, compliance, elasticity, and data protection.

HPE GreenLake for Aruba networking 

Delivering comprehensive edge connectivity networking solutions, HPE is building out its network as a service (NaaS) offerings with HPE GreenLake for Aruba networking. The new services simplify the process of procuring and deploying NaaS and allow customers to align network spending to usage needs while ensuring that the network is always ready to support business objectives.

The new services are built to satisfy growing demand for NaaS and the ability to operate in either a ‘traditional’ or managed service provider (MSP) model. Covering a full span of business use cases – including wired, wireless, and SD-Branch – the new services provide increased levels of velocity and flexibility, accelerating business time to revenue.

HPE GreenLake for Block Storage 

HPE GreenLake for Block Storage is the industry’s first block storage as-a-service to guarantee 100% data availability, built on a cloud operational model. It helps businesses transform faster and brings self-service agility to critical enterprise applications. The new offering delivers the following capabilities:

        • Self-service provisioning to provide line of business owners and database admins the agility required to build and deploy new apps, services, and projects faster
        • Frees IT resources to work on strategic, higher-value initiatives, with up to 98% operational time savings

HPE Backup and Recovery Service

HPE has harnessed backup as a service with an offering built for hybrid cloud. Businesses can effortlessly protect their virtual machine data, gain rapid recovery on-premise, and store long-term backups cost-effectively in the public cloud. HPE Backup and Recovery Service is now available for virtual machines deployed on heterogeneous infrastructure. HPE is also advancing its ransomware recovery solutions by adding immutable data copies, on-premise or on Amazon Web Services (AWS), with HPE Backup and Recovery Service.

HPE GreenLake for High-Performance Computing 

HPE is further enhancing its HPE GreenLake for High-Performance Computing, making it more accessible for any enterprise to adopt the technology by adding new, purpose-built HPC capabilities. The new capabilities quickly tackle the most demanding compute and data-intensive workloads, power AI and ML initiatives, and accelerate time to insight. These also include lower entry points to HPC, with a smaller configuration of 10 nodes, to test workloads and scale as needed. New features and capabilities include:

        • Expanded GPU capabilities that will integrate with HPE’s Apollo 6500 Gen10 system to accelerate compute and advance data-intensive projects using the NVIDIA A100, A40, and A30 Tensor Core GPUs in increments of 2, 4, or 8 accelerators. The new service will feature NVIDIA NVLink for a seamless, high-speed connection that lets GPUs work together as a single robust accelerator.
        • HPE Slingshot, the world’s only high-performance Ethernet fabric designed for HPC and AI solutions, delivers high-performance networking to address demands for higher speed and congestion control in larger data-intensive and AI workloads.
        • HPE Parallel File System Storage, a scalable, high-performance storage solution to deliver advanced throughput for broader HPC and AI needs.
        • Multi-cloud connector APIs can programmatically orchestrate HPC workflows on a diverse pool of computing resources, such as other HPE GreenLake for HPC or public clouds. The new model delivers more elasticity, scalability, and tools to optimise the usage of disaggregated resources. The capability improves collaboration by connecting to other projects between multiple sites and removing silos.



The public cloud creates new possibilities in speed and agility, greatly simplifying IT and letting developers and IT operations scale new apps and capabilities swiftly. Despite these advantages, data gravity, latency, application reliability, and regulatory compliance mean that many applications and data should remain in data centres and co-location facilities.

HPE GreenLake’s as-a-service architecture blends the agility and affordability of the public cloud with the security and performance of on-premise IT to deliver on-demand capacity. With GreenLake services, we can help you accelerate your digital transformation by using cloud benefits like quick deployment, scalability, and pay-per-use economics while maintaining control over your on-premise environment. Contact us to discuss the possibilities of HPE GreenLake for your business.

What’s Cybersecurity Insurance, and Does My Business Need It?


At their core, cybercriminals are opportunists. Give them the chance to test your vulnerabilities, be it your network or human emotion, and they’ll exploit you financially at every possible turn. Make no mistake: if you run a business, you’re seen as a money maker. No matter the size.

Don’t assume your standard errors and omissions (E&O), property, or general liability policies will help you if you get in a bind either. Most won’t cover cyber damages or electronic record breaches.

For that, you need something called cyber liability insurance. This primer will help you get acquainted and protect your sensitive data, as well as your bank account.

Cybersecurity Insurance, Defined

This insurance product specifically covers cyber incidents, insulating businesses from the financial losses and costs associated with them. Most policies cover an array of common cybercrimes, including data breaches, electronic record theft, ransomware attacks, denial-of-service events, and other forms of hacking.

It’s important to note that, “Cybersecurity policies can change from one month to the next, given the dynamic and fluctuating nature of the associated cyber-risks. Unlike well-established insurance plans, underwriters of cybersecurity insurance policies have limited data to formulate risk models to determine insurance policy coverages, rates, and premiums,” explains TechTarget.

How Do You Know If You Need Cybersecurity Insurance?

While no business is immune to cyberattacks, some are more likely targets than others.

Look into cyber insurance if you fall into one of these three categories:

1. You rely on electronic data to run your business.

Anything stored online or on a computer can be hacked. Period. Even if you make every possible effort to secure your data, the risk of a breach is never zero. If you think about all the types of information most businesses work with – social security numbers, credit card info, company financial data – it’s easy to see why they’d be of value to a threat actor. Cyber policies can help cover ransomware payments and legal fees should your data get stolen.

2. You have a sizable customer reach.

The bigger your customer base, the more money a hacker can potentially make from selling those records. It’s also a PR and regulatory nightmare to alert anyone you’ve ever done business with about a data breach you’ve incurred. With the right cyber insurance, you can insulate your business from some associated cleanup costs and regulatory fines put in place to protect consumers.


3. You’re profitable or have business assets.

So you’re turning a profit in your business. Great! That alone can be like a beacon for nefarious cyber actors. They may view it as a green light to drive up the price of your ransomware payment or simply disrupt your hard-earned revenue. And even though it’s impossible to know the total cost of a breach ahead of time, it’s likely the premium on your cyber policy will pale in comparison.

Where Do You Go For Cybersecurity Insurance Coverage?

Several reputable insurance providers now offer cybersecurity coverage on their insurance plans. You’ll typically be creating a new policy, separate from any you already have with a carrier. However, some providers do offer cybersecurity endorsements that can be packaged with other policies.

If you’re curious whether your current insurance provider offers cyber coverage, contact your broker or find a cybersecurity insurance specialist.

And remember, since the threat landscape is constantly shifting, simply instating a cyber policy is likely not enough to keep your business totally protected. Insurance is focused on recovery, but you as an employer still have to take the reins on prevention.

3 Minute Read: Legacy System Modernisation with GreenLake


Applications are central to almost every business process, and they need to be flexible, reliable, and secure to be effective and profitable. Improving the underlying infrastructure (servers, mainframes, and data storage, whether on-premise or in the cloud) helps ensure that applications remain continually available.

Application modernisation necessitates organisational changes, such as shifting some resources from on-premise deployments to cloud deployments. Often, legacy infrastructure may be perceived as being too old and too difficult to modernise, especially where that infrastructure sits in ERP (enterprise resource planning). ERP is at the heart of business operations and complex modernisation projects can have the potential to disrupt business processes.

There are several approaches to modernisation depending on what your business requirements are.

Pathways to updated infrastructure

System modernisation is usually initiated when opportunities for improvement are identified. The main drivers involve business fit, value and agility, cost savings, reduced complexity, and risk aversion. Legacy infrastructure that has become too costly and complex to maintain, or that puts the company’s security at risk, is a common trigger for businesses to update.

Once the opportunity for improvement has been identified, businesses can then evaluate modernisation options. According to Gartner, there are seven options that have been ranked by ease of implementation:

  1. Encapsulate – Making applications available as services via an API.
  2. Rehost – Redeployment of the application component to other physical, virtual or cloud infrastructure without further modification.
  3. Replatform – Migration to a new runtime platform with minimal changes to the code (not the code structure, features or functions).
  4. Refactor – Restructuring and optimisation of the existing code, removing technical debt and improving its non-functional attributes.
  5. Rearchitect – Altering the code to shift it to a new application architecture.
  6. Rebuild – Completely redesigning the application component while preserving its specifications.
  7. Replace – Entirely removing the former application component and replacing it while considering new needs and requirements.
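Option 1, Encapsulate, is the gentlest of these: the legacy logic is left untouched and exposed through a small service-style interface. The sketch below illustrates the pattern; the legacy function and the API shape are hypothetical examples, not a real system:

```python
# Minimal "Encapsulate" illustration: wrap untouched legacy logic
# behind a JSON-style facade that new applications can call.

def legacy_price_calc(qty, unit_cents):
    # Imagine this is decades-old, battle-tested code nobody wants to rewrite.
    total = qty * unit_cents
    if qty >= 100:
        total = int(total * 0.9)  # bulk discount
    return total

def price_api(request: dict) -> dict:
    """Facade: validates input, then delegates to the legacy code unchanged."""
    try:
        qty, unit = int(request["qty"]), int(request["unit_cents"])
    except (KeyError, ValueError):
        return {"status": 400, "error": "qty and unit_cents required"}
    return {"status": 200, "total_cents": legacy_price_calc(qty, unit)}
```

Because the legacy function is never modified, the risk to existing business processes is minimal, which is exactly why encapsulation ranks first for ease of implementation.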

An easier implementation poses less risk and impact on business processes, while harder implementations pose more risk and impact.

Choosing the best fit modernisation strategy will depend on factors such as the organisation’s existing technology, architecture, technical capabilities and risk appetite.

At its core, modernisation means either rearchitecting, rebuilding or replacing. To successfully complete any of these, businesses need the right underlying IT model to ensure business operations continue with minimal disruption.

HPE GreenLake

Regardless of which strategy you choose, HPE Greenlake is a consumption-based IT model which can support your transformation. By providing a true hybrid cloud experience, it enables IT teams to manage their on-premise and cloud-based systems and data from one intuitive portal.

HPE GreenLake even allows the organisation to choose specific software version levels in the environment and when (or if) they are updated. This is crucial for organisations with infrastructure that may be several firmware versions behind.

HPE GreenLake brings the modern cloud experience to legacy apps, data and workloads in your locations, with self-serve, pay-per-use scalability, all managed as a service by HPE and its partners.

A digital transformation accelerator, HPE GreenLake is the one platform that brings the cloud experience to apps and data where they live. It transforms traditional, non-cloud native apps with an open, container-first approach and enables businesses to create their own AI/ML competency in-house.


Tech refresh, upgrades, transformational projects or legacy system modernisation are all opportunities to have a conversation about moving to the on-premises cloud experience of HPE GreenLake. For current HPE compute users, bringing HPE legacy server estates under HPE GreenLake control is one way to get started.

Whether you are a current HPE user or not, as your IT partner, we can help you identify the best modernisation approach for your business needs and implement the right HPE solutions for your business goals.

Meraki Virtual MX Appliances for Public and Private Clouds

Virtual MX (vMX) is a virtual instance of a Meraki security and SD-WAN appliance dedicated specifically to providing the simple configuration benefits of site-to-site Auto VPN for organisations running or migrating IT services to public or private cloud environments. An Auto VPN tunnel to a vMX is like having a direct Ethernet connection to a private data centre.

Features and Functionality of the vMX Appliance

vMX functions like a VPN concentrator and includes SD-WAN functionality like other MX devices. For public cloud environments, a vMX is added via the respective public cloud marketplace; for private cloud environments, a vMX can be spun up on a Cisco UCS running NFVIS. Setup and management happen in the Meraki dashboard, just as with any other MX.

vMX sizes at a glance:

Recommended use case (all sizes): extend secure SD-WAN connectivity from branch sites to resources in public and private cloud environments.

Supported cloud platforms (all sizes): Google Cloud Platform¹, Alibaba Cloud.

Maximum site-to-site VPN throughput: 200 Mbps (vMX – Small), 500 Mbps (vMX – Medium), 1 Gbps (vMX – Large).

Maximum concurrent site-to-site VPN tunnels:

Client VPN support:




Experience our technology.

Build your network on the platform designed for how people work.

¹ Targeted 2HCY 2021
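The site-to-site VPN throughput tiers above (200 Mbps, 500 Mbps, 1 Gbps) lend themselves to a quick capacity check when choosing a size. The helper below is an illustrative sketch, not a Meraki tool; the tier names and planning logic are assumptions.

```python
# Published maximum site-to-site VPN throughput per vMX size (Mbps).
VMX_TIERS = [("vMX - Small", 200), ("vMX - Medium", 500), ("vMX - Large", 1000)]

def pick_vmx_size(required_mbps: float) -> str:
    """Return the smallest vMX tier whose throughput ceiling covers
    the aggregate branch-to-cloud VPN demand."""
    for name, ceiling_mbps in VMX_TIERS:
        if required_mbps <= ceiling_mbps:
            return name
    raise ValueError("Demand exceeds the largest vMX tier; consider "
                     "splitting traffic across multiple concentrators.")
```

A sizing exercise like this should also account for headroom and for the concurrent-tunnel limits of each size, which the tiers alone do not capture.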

Service Economy: New IoT and Edge Opportunities

We live in the age of “connected intelligence.” Computing technology has had a significant impact on global organisations and how people communicate and interact with information.

Edge computing has evolved as a computational model that uses the resources available at the edge of connected environments. Edge computing does not replace cloud computing; it augments it by distributing workloads in scenarios where traditional cloud architecture is ineffective. Multi-player gaming, augmented reality/virtual reality, autonomous vehicles, connected manufacturing floors, robots, and video processing are emerging use cases of edge computing. Central to the solutions chosen by organisations is the manner in which the data is stored, moved and accessed.

The proliferation of IoT has resulted in an explosion of data, forcing organisations to rely more heavily on computing and storage solutions. However, migrating the entire IT infrastructure to the cloud raises concerns about latency and cost feasibility. Businesses that use IoT sensors, actuators, and other IoT devices are increasingly seeking edge computing solutions like edge nodes, devices, and hyper-localised data centres. Edge computing augments the existing cloud architecture in IoT applications by bringing data processing closer to the data source, allowing enterprises to make better decisions faster.

The anatomy of IoT applications and challenges at the edge

Cloud-first IoT architecture

In a cloud-centered IoT design, devices connect to the cloud directly and exchange data with a remote data centre. This well-known design serves as the foundation for numerous web applications. The main advantages of cloud-based architecture include implementation robustness, easy management and control of IoT devices through centralised mechanisms, and device implementation simplicity. Data centralisation enables the processing of larger volumes of data and more accurate conclusions using machine learning technology.
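In code terms, the cloud-first pattern reduces to devices pushing raw readings to one central store where all processing happens. The class and field names below are a hypothetical sketch of that shape, not a real cloud service.

```python
# Hypothetical central ingestion point: every device sends its data here,
# and all analysis runs against the centralised store.
class CloudStore:
    def __init__(self):
        self.readings = []  # (device_id, value) pairs from the whole fleet

    def ingest(self, device_id: str, value: float) -> None:
        self.readings.append((device_id, value))

    def fleet_average(self) -> float:
        # Centralisation makes fleet-wide analysis like this trivial.
        return sum(v for _, v in self.readings) / len(self.readings)
```

The same centralisation that enables fleet-wide analytics is what introduces the latency, cost, and privacy trade-offs discussed next.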

Cloud-first applications do, however, come with limitations that should be considered. The cost of a cloud design can be considerable, and its latency is unacceptable for many applications that require rapid responses derived from massive amounts of data.

From a security perspective, cloud app privacy concerns can be expected when all data is received and stored by third-party cloud operators. Centralised systems can be vulnerable to attack, potentially jeopardising the entire network of devices.

Peer-to-peer architecture

Direct, peer-to-peer communication is another way for IoT devices and apps to communicate with one another. In a peer-to-peer (P2P) network, data is shared directly between peers, with a server acting only as a connection broker: each peer connects to a central server, which mediates a direct, end-to-end encrypted connection between the peers. Once the direct connection is established, the server drops out of the picture. All data therefore remains on the devices and can be retrieved directly from them.
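The broker’s role can be sketched in a few lines: it only introduces peers, then leaves the data path. The names and endpoint format below are assumptions for illustration; real P2P stacks add NAT traversal and key exchange on top of this.

```python
# Hypothetical connection broker: it pairs peers but never carries data.
class ConnectionBroker:
    def __init__(self):
        self.registry = {}  # peer_id -> (host, port)

    def register(self, peer_id: str, endpoint: tuple) -> None:
        self.registry[peer_id] = endpoint

    def introduce(self, a: str, b: str) -> dict:
        """Hand each peer the other's endpoint; after this, the peers
        talk directly and the broker is out of the picture."""
        return {a: self.registry[b], b: self.registry[a]}
```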

The main benefit of this approach is privacy, as data is only transferred between devices with direct access to each other. The distributed design also strengthens security by forcing attackers to compromise individual devices to reach more data. Because direct connections avoid an intermediary, latency is lower and information flows substantially faster. P2P systems also scale well: adding devices does not necessarily require additional central processing capacity.

Because P2P architecture is decentralised, regular data backups can be an administrative challenge. P2P also raises questions of trust between devices and of how to provision value-added services on top of them.


Hub-and-spoke architecture

The hub-and-spoke design sits between the cloud-first and peer-to-peer architectures. It suits IoT devices that have limited resources and must connect to a hub before reaching the Internet. The hub serves as a device gateway and implements network security and wide-area network (WAN) protocols. Hubs pre-process data and use network controls and local data to configure and administer the devices on their local network.
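The hub’s pre-processing role can be sketched as follows. The reading format and summary fields are hypothetical, but the shape is the point: constrained devices hand raw samples to the hub, and only a compact summary travels over the WAN.

```python
# Hypothetical hub-side aggregation: raw samples in, compact summary out.
def hub_preprocess(samples: list) -> dict:
    """Reduce a batch of raw sensor samples to the summary the hub
    forwards upstream, cutting WAN traffic from N values to four."""
    values = [s["value"] for s in samples]
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": sum(values) / len(values),
    }
```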

Hub-and-spoke designs preserve the benefits of P2P networks, and hub-and-spoke networks are easy to manage using a well-known method. However, hubs pose risks of their own: a hub can be a single point of failure, and there is a high risk of vendor lock-in, since hub hardware is harder to replace than a cloud service and can raise serviceability issues.

Challenges at the edge

Network security has emerged as a critical factor in edge computing applications. As a result of decentralised apps, attacks have become more complicated and persistent. Traditional security measures focused on the network’s perimeter have proven ineffectual as standalone security strategies. Modern network security must deal with an ever-changing and diverse set of users and devices and significantly more pervasive attacks targeting formerly “trusted” portions of network infrastructure.

Edge apps generate massive volumes of unstructured data daily. This data provides real-time insights that can increase corporate efficiency, improve consumer experiences, and open new revenue opportunities. Turning real-time data into actionable insights requires analysing and processing it at the source – the edge – where people, devices, and things connect to the digital world. Using unstructured data at the edge necessitates a network that uses artificial intelligence to process data at a rate and volume well beyond what is humanly achievable. It requires infrastructure with an AI-powered “sixth sense” that identifies possible problems ahead of time, proposes a course of action, and uses automation to turn those suggestions into action, all without manual intervention.

Aruba ESP

Aruba ESP (Edge Services Platform) helps organisations manage their edge installations and accelerate transformation by continuously analysing network, user, and device information. A single, cloud-native platform that can be deployed on-premises or in the cloud, Aruba ESP turns that data into knowledge, protecting and integrating corporate infrastructure to ensure business continuity.

Aruba ESP is founded on the following fundamental principles:

AIOps is a critical component of Aruba ESP that leverages AI and analytics to pinpoint the root causes of network issues and automatically remediate them. It monitors user experience in real time, tunes the network to prevent problems before they occur, and continuously optimises and secures the network through peer benchmarking and prescriptive recommendations. In actual customer deployments, AIOps has dramatically increased throughput capacity, reduced issue-resolution times, and improved the end-user and IT experience.

With Unified Infrastructure, Aruba Central unifies all network operations for switching, Wi-Fi, and SD-WAN across campus, data centre, branch, and remote-worker settings. Aruba’s unified infrastructure strategy gives clients the option of running controller services on-premises or in the cloud, allowing maximum flexibility at enterprise scale.

Zero Trust Network Security integrates role-based access technology, dynamic segmentation, and identity-based intrusion detection. This enables it to authenticate, authorise, and control every user and device connecting to the network and detect, prevent, isolate, and stop attacks before they disrupt business.
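As a toy illustration of role-based access with default deny – the roles, destinations, and policy table below are invented for illustration, not Aruba’s API – every connection is checked against an explicit allow list, mirroring the Zero Trust principle that nothing is trusted by default.

```python
# Hypothetical role policy: access is granted only when explicitly
# allowed for the device's role; everything else is denied by default.
POLICY = {
    "ip-camera": {"video-storage"},
    "staff-laptop": {"intranet", "internet"},
}

def authorize(role: str, destination: str) -> bool:
    # Unknown roles get an empty allow set, i.e. default deny.
    return destination in POLICY.get(role, set())
```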

Aruba ESP enables the intelligent edge and empowers organisations seeking to accelerate transformation and ensure business continuity.


Given the importance of capitalising on opportunities at the edge, organisations should consider prioritising a solid network foundation. In the design of your IT infrastructure, it’s important to meet today’s expectations while preparing for the next significant technological advancement.

Aruba ESP is the industry’s first platform with AI-powered modules capable of meeting the needs of intelligent edges. Aruba ESP uses AIOps, Zero Trust security, and unified infrastructure principles to help IT and the network handle the velocity and volume of data generated and processed at the edge. It mitigates advanced threats from a vanishing security perimeter, and operational challenges posed by increasingly complex network architecture. As your IT partner, we can help you set up the best architecture with the relevant elements of Aruba ESP for your business needs.