By: Mohammad Jamal Tabbara, Head of Solutions Architect, Middle East, Turkey, & Africa at Infoblox
First identified by Gartner in 2019, secure access service edge (SASE) is a new model for networking and security designed to meet the growing demand for network architectures that are more fluid, secure, and easy to manage.
The SASE concept addresses the network complexities associated with distributed enterprises. It is a response to the constraints of traditional hub-and-spoke architectures, tool proliferation, siloed solutions, and manual processes that prevent organizations from moving at cloud speed. A SASE network merges networking and security capabilities, is built using cloud-native principles, is delivered in the cloud on an on-demand, SaaS basis, and relies on DDI (DNS, DHCP, and IP address management) as a foundational, unifying layer.
In the future, AI may play a crucial role in SASE network security by enabling real-time threat detection and response. Machine learning algorithms will soon be able to train themselves to detect anomalies and patterns in network traffic, allowing organizations to respond quickly to potential threats or even trigger automatic remediation at both the security and network levels.
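The anomaly-detection idea can be illustrated with a minimal sketch. This is not a trained model, just a rolling z-score over per-minute DNS query counts; the data, function name, and threshold are all illustrative assumptions, not part of any SASE product.

```python
from statistics import mean, stdev

def is_anomalous(history, current, threshold=3.0):
    """Flag a traffic sample as anomalous if it deviates from the
    recent baseline by more than `threshold` standard deviations."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# Example: per-minute DNS query counts from one client (made-up data)
baseline = [120, 115, 130, 125, 118, 122]
print(is_anomalous(baseline, 121))   # typical volume -> False
print(is_anomalous(baseline, 4500))  # sudden burst   -> True
```

A production system would learn a richer baseline (per client, per time of day) and feed detections into an automated response pipeline, but the shape of the decision is the same.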
SASE: simplifying the network edge
Using cloud-native architectures, SASE unifies networking and security into a single platform, informed by the user’s network context. Organizations deploy and manage a SASE network from the cloud as SaaS-based capabilities.
In a SASE framework, the burden of managing and securing a network moves from labor-intensive, server-based appliances in the data center to virtual and containerized applications in the cloud. As a result, SASE networks enable organizations to:
- Simplify management
- Scale elastically
- Dynamically deploy networking and security capabilities as needed
- Consume versatile network and security capabilities as cloud-based applications
Key integration capabilities missing from SASE implementations
You might be forgiven for assuming that if you deploy a unified SASE platform, your network would consequently be enterprise-grade at the edge. If only it were that simple. Unfortunately, SASE implementations currently lack essential integration necessary for enterprise-edge networking. That should come as no surprise. After all, the vital integration we’re referring to here involves the same core network services that most IT, networking, and security organizations consistently undervalue.
How core network services enable SASE networks
In a SASE model, networking and security capabilities are intended to be tightly integrated and interoperable. And yet, to achieve its full potential for integration, a SASE-based platform must be able to harness the network services that are common to all networking and security functions — DNS, DHCP, and IP address management. Most SASE implementations, as currently offered, do not adequately integrate these services into their platforms. When incorporated as a foundational layer in SASE-based networking, core network services deliver the following advantages:
- Centralized visibility. The data residing in core network services provides enhanced network visibility for SASE implementations, enabling networking and security teams to centrally monitor and manage devices and application usage across physical, virtual, and cloud infrastructure.
- Network user context. Similarly, core network services such as DNS carry user and device context that enables SASE deployments to optimize network operations and automatically secure application access at the edge.
- Local survivability for distributed locations. Resilience is a top priority for a SASE network. Applied effectively, core network services can ensure continuous Internet access for distributed locations any time they lose connectivity to headquarters.
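The local-survivability point can be sketched as a simple resolver failover: a branch site tries its primary (cloud or headquarters) resolver first and falls back to an on-site instance when the link is down. The resolver callables and addresses below are hypothetical stand-ins for real DNS clients, not any vendor's API.

```python
def resolve_with_failover(hostname, resolvers):
    """Try each resolver in priority order; return the first answer.

    `resolvers` is a list of (label, callable) pairs, where each callable
    takes a hostname and returns an IP string or raises on failure.
    """
    errors = []
    for label, resolver in resolvers:
        try:
            return label, resolver(hostname)
        except Exception as exc:  # in practice, catch the client's timeout error
            errors.append((label, exc))
    raise RuntimeError(f"all resolvers failed for {hostname}: {errors}")

# Illustrative stand-ins: the HQ resolver is unreachable, the local one answers.
def hq_resolver(name):
    raise TimeoutError("no route to headquarters")

def local_resolver(name):
    return "10.0.0.53"  # answer from the on-site DDI instance (made up)

source, ip = resolve_with_failover(
    "app.example.com",
    [("hq", hq_resolver), ("local", local_resolver)],
)
print(source, ip)  # -> local 10.0.0.53
```

In a real deployment the fallback is handled inside the DDI service itself, so clients keep resolving names and reaching the Internet even when the site is cut off from headquarters.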
Cloud-native delivery for core network services
The SASE framework emphasizes the use of cloud-native design in networking components. As you weigh your options for incorporating core network services into the design of your edge network, seek out solutions built using cloud-native microservices and containers. Containerized instances of DDI services are faster and easier to manage than virtualized alternatives. They also consume far fewer resources and deliver extremely low latency, providing you with the agility and resilience you need at the network edge.