Firewalls are among the most useful information security and compliance tools. Their role is to monitor traffic moving across network borders and determine whether it should be allowed to pass. Among other responsibilities, firewalls prevent unauthorized access to networks on which sensitive data is stored, making them an essential tool for businesses seeking to comply with regulations and standards such as HIPAA, PCI DSS, GDPR, and SOC 2.

This article explores the AWS Network Firewall, a firewall available to businesses that host sensitive data on the Amazon Web Services (AWS) platform. 

What is the AWS Network Firewall?

AWS Network Firewall is a managed, auto-scaling firewall and intrusion detection and prevention service that protects Amazon Virtual Private Clouds (VPCs). It monitors and filters unwanted and unauthorized traffic into and out of VPCs. AWS Network Firewall is one of several firewalls available on the AWS platform, including Security Groups, Network Access Control Lists, and the AWS Web Application Firewall.

The AWS Network Firewall is designed to be straightforward to use and to require minimal infrastructure management following the initial deployment. As a managed service, it can be deployed quickly. It scales automatically with network traffic, removing the need for businesses to build and operate infrastructure to support essential network traffic monitoring and filtering. 

AWS Network Firewall is in scope for a wide range of AWS compliance programs, which means it can be used as part of a secure system that complies with HIPAA, PCI DSS, FedRAMP, and other frameworks. However, it should be emphasized that using AWS Network Firewall is not sufficient to achieve compliance with any framework; compliance is ultimately the responsibility of AWS users. 

AWS Network Firewall Features

We’ve already discussed some of AWS Network Firewall’s headline features: it’s a managed service for monitoring and filtering network traffic to and from Amazon VPCs. But there are other features that set it apart from alternative firewall services on the platform. 

  • AWS Network Firewall operates as both a stateless and a stateful firewall. Users can configure stateless rule groups that examine packets in isolation or stateful rule groups that consider a packet’s context; for example, is the packet a response to a request from a particular IP address? (A brief scripted example follows this list.)
  • It is a high-availability auto-scaling firewall. As a managed service, Amazon handles redundancy and scaling, so users can rely on their firewall’s infrastructure to grow and shrink in line with demand. 
  • AWS Network Firewall includes an intrusion detection and prevention system. It monitors traffic flows in real time and can adapt to protect networks against vulnerability exploits and brute force attacks. 
  • AWS Network Firewall integrates with other AWS security services, including the AWS Firewall Manager, allowing users to consistently organize and manage rule groups and policies. 
  • Users can take advantage of managed rule groups, predefined rules that Amazon automatically updates to account for new software vulnerabilities. Managed rule groups significantly reduce the time and effort required to keep rules up-to-date. 
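
As an illustration of the stateful option described above, the following minimal sketch uses the boto3 SDK to create a stateful rule group that blocks traffic to a list of domains. The rule group name, capacity, and domain are hypothetical placeholders, and the caller needs permissions to manage Network Firewall resources.

  import boto3

  # Hypothetical sketch: create a stateful rule group that blocks outbound
  # traffic to specific domains. All names and values are placeholders.
  firewall = boto3.client("network-firewall", region_name="us-east-1")

  response = firewall.create_rule_group(
      RuleGroupName="block-example-domains",  # placeholder name
      Type="STATEFUL",                        # stateful rules consider connection context
      Capacity=100,                           # estimated rule-processing capacity
      RuleGroup={
          "RulesSource": {
              "RulesSourceList": {
                  "Targets": [".example-bad-domain.com"],   # placeholder domain
                  "TargetTypes": ["TLS_SNI", "HTTP_HOST"],  # match TLS SNI and HTTP Host headers
                  "GeneratedRulesType": "DENYLIST",         # deny traffic to the listed targets
              }
          }
      },
  )
  print(response["RuleGroupResponse"]["RuleGroupArn"])

The resulting rule group can then be referenced from a firewall policy alongside any stateless rule groups.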

We’ve highlighted some of the most attractive features here, but you can see a complete breakdown of AWS Network Firewall features in the service’s documentation.

Is AWS Network Firewall Layer 7?

AWS Network Firewall operates at Layers 3-7. These numbers refer to the OSI Model, which divides network communications into seven layers. Traditional firewalls operate at Layer 3, the network layer. They can inspect and filter packets traveling over the network, but they cannot, for example, identify attacks that exploit vulnerabilities in web applications—they have no insight into protocols that operate at Layer 7, the application layer.

In contrast, AWS Network Firewall can filter VPC network traffic at the network, application, and other layers. It is a flexible network filtering and intrusion detection service that complements AWS’s other firewall services. 

What Are AWS Network Firewall Deployment Models?

To understand AWS Network Firewall deployment models, we first need to discuss how the firewall works. In short, network traffic to the VPC is routed to a firewall endpoint to be examined before it enters or exits the network. The firewall endpoint is deployed within a subnet of a VPC. Ingress and egress traffic flows through the firewall endpoint subnet and then to other protected subnets containing your cloud infrastructure. 
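
As a rough sketch of that routing, the boto3 snippet below points a protected subnet’s default route at a firewall endpoint so outbound traffic passes through the firewall first. The route table and endpoint IDs are placeholders; the endpoint ID is reported by Network Firewall once the firewall is deployed.

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")

  # Send all outbound traffic from a protected subnet through the firewall
  # endpoint. Both IDs are placeholders for illustration.
  ec2.create_route(
      RouteTableId="rtb-0123456789abcdef0",    # route table of the protected subnet
      DestinationCidrBlock="0.0.0.0/0",        # default route
      VpcEndpointId="vpce-0123456789abcdef0",  # firewall endpoint in the firewall subnet
  )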

Deployment models influence where the firewall endpoint subnet is deployed. In a typical distributed deployment model, a firewall subnet is deployed into each virtual private cloud—each VPC has its own firewall subnet. This model allows VPCs to have an independently managed firewall with a unique firewall policy. It is typically used to monitor and filter traffic between the internet and a protected subnet, although there are other use cases. 

In contrast, a centralized deployment model uses a centralized VPC into which one or more firewall subnets are deployed. This model is often used to inspect traffic flowing between VPCs or between a VPC and a business’s on-premises infrastructure. You can read more about deployment models in Deployment models for AWS Network Firewall.

AWS Network Firewall vs. Security Groups and NACLs

AWS Network Firewall is one of several firewall services available on AWS. 

  • Security Groups are stateful firewalls that filter traffic to Elastic Network Interfaces, which are typically attached to EC2 instances. Security groups provide granular filtering for individual instances.
  • Network Access Control Lists (NACLs) are optional stateless firewalls associated with one or more subnets within a virtual private cloud. 
  • AWS WAF is a web application firewall that filters traffic for web applications and APIs, allowing users to block common attacks such as those included in the OWASP Top Ten.

You might be wondering why AWS needs so many firewalls. They each play a distinct role. AWS Network Firewall protects the perimeter of your virtual private cloud. It controls inbound and outbound traffic for the entire network. 

In contrast, security groups are associated with individual EC2 instances and some other services. NACLs are an additional firewall that controls traffic to and from subnets, allowing users to configure rules that apply to multiple groups of instances and control traffic flowing between subnets. 

Together, these firewalls give users enormous flexibility in configuring access to instances, subnets, and VPCs. For example, you may want to allow connections of a specific type into your VPC with AWS Network Firewall while configuring Network Access Control Lists that deny similar connections access to particular subnets. Another use case for multiple firewalls is to run production and testing subnets that should both be able to receive requests from external networks but should not be able to communicate directly with each other. 
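
As a hedged illustration of that layered setup, the boto3 sketch below allows HTTPS into an instance’s security group while the subnet’s network ACL denies the same traffic from a particular source range. The group ID, NACL ID, and CIDR blocks are placeholders.

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")

  # Security group rule: allow inbound HTTPS to instances in this group.
  ec2.authorize_security_group_ingress(
      GroupId="sg-0123456789abcdef0",  # placeholder security group ID
      IpPermissions=[{
          "IpProtocol": "tcp",
          "FromPort": 443,
          "ToPort": 443,
          "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "Allow HTTPS"}],
      }],
  )

  # NACL rule: deny the same HTTPS traffic from one source range at the
  # subnet boundary. Lower rule numbers are evaluated first.
  ec2.create_network_acl_entry(
      NetworkAclId="acl-0123456789abcdef0",  # placeholder NACL ID
      RuleNumber=90,
      Protocol="6",                # TCP
      RuleAction="deny",
      Egress=False,                # applies to inbound traffic
      CidrBlock="203.0.113.0/24",  # placeholder source range
      PortRange={"From": 443, "To": 443},
  )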

AWS Network Firewall is one component of a layered approach to cloud security. To learn more, visit our extensive cloud security and compliance resources or contact a cloud security specialist to discuss KirkpatrickPrice’s cloud security audit and compliance audit services.

Have you considered moving your business’s data center to the cloud? The proportion of businesses operating an in-house data center has declined over the last decade. Many—from small companies to multinational corporations—have migrated their workloads to the cloud. Estimates suggest that about a third of businesses run more than 50% of their workloads in the cloud, and the majority run at least some workloads on cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), or their competitors.

Within this article, we will explore why businesses migrate their data center to the cloud and how it may be the right decision for your business’s long-term technology strategy.

Data Centers vs. The Cloud

Before cloud computing, there were several options for hosting technology infrastructure. A large company might invest in building, equipping, and staffing a data center. Smaller companies may instead use an on-site server room or server cupboard. Alternatively, businesses could buy server hardware and colocate it in a data center managed by a third party.

Over the years, many different data center hosting models developed. Still, they were similar in one way: the user paid for and managed physical infrastructure housed in a data center facility.

In contrast, “cloud” is a broad term for compute, storage, and software services that do not require users to manage or interact with physical hardware, which is managed by the vendor and resides in their data centers. 

Cloud services are typically divided into three main categories:

  • Infrastructure as a Service (IaaS) provides virtual servers, networks, and other infrastructure on which users can host their software.
  • Platform as a Service (PaaS) provides higher-level services for hosting websites and applications. PaaS platforms simplify IT management by combining compute, storage, networking, and related software services into a single platform.
  • Software as a Service (SaaS) provides software hosted on the operator’s infrastructure and accessed by the user over the internet.

Today, there are many additional “X as a Service” cloud modalities that reflect the diversity of products offered via the cloud model, such as Database as a Service, Disaster Recovery as a Service, and Desktop as a Service.

Cloud Migration Benefits

We’ve discussed the differences between cloud and non-cloud infrastructure hosting, but why have so many businesses chosen to migrate their data center to a cloud platform? Let’s explore five benefits that make the cloud an attractive proposition.

Scalability and Elasticity

Scaling is among the most challenging aspects of managing a data center. Infrastructure requirements change over time, but they rarely grow smoothly and predictably, often fluctuating by season or time of day. Traffic spikes may demand resources many times the average, and your data center must cope. That means investing in servers and network infrastructure that will be idle for most of its life.

In contrast, cloud infrastructure scales with demand. A cloud platform’s virtual infrastructure is built on a large pool of computational resources—the physical infrastructure the platform vendor is responsible for. Cloud users can take advantage of as much or as little of that pool as they need. Instead of researching, buying, configuring, and maintaining physical servers in their data center, a cloud user simply deploys more virtual resources—a process that can be automated.

Elasticity is a consequence of the cloud’s ability to scale quickly. An elastic infrastructure deployment can grow or shrink in line with user demand. There’s no need to deploy idle infrastructure in anticipation of traffic spikes. Businesses can instead adjust cloud deployments to match current requirements.

Reduced IT Costs

We have already hinted at one way migrating to the cloud reduces IT costs. The cloud’s scalability allows businesses to adjust deployed resources to match demand. Unlike a physical data center, cloud platforms operate with on-demand pricing: users pay only for the resources they actually consume. In contrast, data centers require significant upfront investments based on uncertain predictions about future resource requirements.

Other ways migrating to the cloud can reduce IT spending include:

  • Lower staffing requirements for equipment maintenance.
  • Reduced real estate spending compared to owned data centers.
  • Reduction of capital expenses and the transfer of IT capital expenditure to operational budgets.
  • Economies of scale through sharing physical hardware with multiple users.

Although cost savings are a benefit of cloud platforms, it should be pointed out that businesses may fail to save money in the cloud. If cloud environments are improperly managed and monitored, companies may pay far more than anticipated. This is particularly true for businesses that lack experience in managing cloud infrastructure.

Enhanced Business Agility

Extended lead times are expected when deploying hardware in a self-managed or colocated data center. It’s not unusual for lead times to stretch to months when research, acquisition, shipping, deployment, and configuration are accounted for.

Cloud platforms, in contrast, allow businesses to deploy new infrastructure in minutes, as we’ve already mentioned. But building on that advantage is the ability to automate cloud deployment and configuration. The programmability of cloud platforms empowers businesses to build continuous integration and deployment pipelines that allow developers to iterate on code and push new features into production with minimal delay.

Reduced Infrastructure Management Burden

While every company needs IT infrastructure, it rarely makes sense for businesses to own and manage a data center. Managing data centers, servers, and networks is complex, expensive, and time-consuming. But it is not in itself a revenue-generating activity. Migrating to a cloud platform allows companies to focus on the applications and services that support their operations while leveraging a cloud vendor’s greater data center resources, expertise, and experience.

Improved Security and Compliance

Migrating to the cloud outsources some security issues to the cloud vendor. For example, when you deploy a virtual server on EC2—AWS’s IaaS service—you don’t have to worry about securing the underlying physical servers and networks. Amazon takes care of it. Additionally, all the major cloud platforms offer world-class security tools and services, such as firewalls, network monitoring and alerting, encryption, secret management, and more.

Cloud platforms can also help businesses comply with information security and privacy regulations. AWS, Microsoft Azure, GCP, and other cloud vendors implement compliance programs that support compliant infrastructure environments.

However, cloud vendors operate a shared responsibility model. The vendor has some security and compliance responsibilities, but so does the user. As we’ve previously written, many of the most common cloud security vulnerabilities result from user error and misconfiguration.

Continuing the EC2 example above, AWS protects the hardware a virtual server runs on, but it does nothing to stop a user from installing insecure software or running SSH with the root user’s password set to “pa55word.” Consequently, although EC2 can be HIPAA-compliant, that doesn’t prevent users from making mistakes that result in HIPAA breaches.

KirkpatrickPrice Helps Companies Stay Secure and Compliant in the Cloud

KirkpatrickPrice is a licensed CPA firm specializing in information and cloud security. Our cloud security audits and compliance audits help businesses verify and demonstrate their security and compliance. To learn more, contact a cloud security and compliance specialist or visit our cloud security resources.

In 2022, data protection is (or should be) a top priority for any business that collects sensitive data, whether that’s personally identifiable information (PII), financial data, intellectual property, or business information. Regulatory compliance is often the primary motivation for implementing a data protection strategy. The penalties for non-compliance with HIPAA, the GDPR, PCI DSS, the CCPA, and other data privacy regulations can damage or even destroy a small or medium business.

That is, of course, the purpose of data privacy regulations. They make the cost of non-compliance so high that businesses are motivated to implement data protection best practices. However, there is another reason companies should invest in data protection: it’s great for business. 

Consumers and business decision-makers are more aware of data breach risks than ever before, and they factor a vendor’s data protection credentials into buying decisions. Data protection is a competitive advantage, and it should be a prominent aspect of your marketing and sales strategy. 

What Is Data Protection?

Data protection encompasses the activities and technologies an organization implements to protect data from theft, unauthorized access, and improper use. It is a broad term that covers a wide range of activities, but its fundamental purpose is to establish a relationship of trust between a business and its customers. Your customers need to know that they can trust you to protect their data. 

Data privacy is one of the most prominent activities covered by the umbrella term “data protection,” but there are others, including using data only for the purposes a customer has consented to and giving customers the ability to access or delete their PII. 

Implementing data protection best practices allows businesses to comply with data protection and data privacy regulations and standards. But, just as important, it reassures customers that your company is capable of keeping their data safe while using it responsibly. 

Why Is Data Protection Important?

Data protection was not a key concern in the early days of the consumer internet, and many businesses failed to follow even rudimentary data protection best practices. But, as the web and cloud services became vital to the economy, increasing quantities of sensitive data were stored and processed by businesses. Perhaps predictably, data breaches and identity theft became common. The media’s focus on massive breaches that leaked millions of sensitive records brought the consequences of poor data protection to public attention. 

In 2022, consumers and businesses have a more sophisticated understanding of data breach risks. Most are happy to use online services, even for sensitive data. But, in return for their trust, they expect businesses to prioritize data protection and implement processes, practices, and technologies that keep data safe. Companies that can’t or won’t implement and demonstrate rigorous data protection practices are at a disadvantage relative to competitors who put data protection front and center. 

How To Use Data Protection To Gain a Competitive Advantage

To leverage the competitive advantage of data protection, it’s not enough to implement secure systems and update your website with copy that boasts: “we’re secure.” Your competitors say the same, and customers cannot verify which claims are accurate. Let’s explore a four-step process businesses can follow to implement, demonstrate, and promote their data security credentials. 

Implement Data Protection Best Practices

Most importantly, your business has to implement data protection best practices that comply with relevant regulatory standards. The details depend on the industry your company operates in, the data it stores, the data protection expectations of its customers, and many other factors. 

If your business lacks the knowledge or expertise to implement data protection best practices, we recommend consulting with a third-party data protection specialist, who will identify risks and help your business to create and implement a compliance plan. 

Create Transparent Data Protection Policies

Create and publish data protection policies that non-technical employees or customers can understand. It may be tempting to use technical or legal language, but the average customer may not understand it. Instead, explain clearly and concisely:

  • Which data you intend to collect.
  • Why you are collecting it.
  • How you will use it.
  • How you will protect it.

If there are legal reasons that compel your business to use technical language in its public-facing policies, you may want to consider publishing a parallel explanation or summary in plain English. 

You may also want to explain the customers’ obligations to protect their data. For example, cloud platforms such as Amazon have well-explained data protection policies, but they make it clear that data protection is a shared responsibility.

Demonstrate Your Data Protection Capabilities with Information Security Audits

How do your customers know you keep your data protection promises? It’s easy to say data protection is a priority, but it’s hard for customers to verify businesses are fulfilling their obligations. If you’d asked the companies behind the biggest data breaches of recent years whether they take data protection seriously, they would have said, “Of course, we do!”

The standard solution to this problem is a third-party audit. Businesses ask a neutral third party with information security and data protection expertise, like KirkpatrickPrice, to examine and report on data protection controls. Audits are carried out with reference to an accepted framework, and auditing methods are standardized. Consequently, the business and its customers can be confident that a third-party audit reflects the reality of the auditee’s data protection implementation. 

Audits can be carried out with reference to many different security standards and regulatory frameworks, including SOC 2, HIPAA, PCI DSS, and the GDPR.

Compliance audits verify the business complies with a specific framework or standard, highlight control gaps and opportunities to improve data protection, and provide a report that demonstrates security and compliance capabilities to potential customers and partners. 

Make Data Protection a Foundation of Your Brand

The next step is to make sure prospective customers know your information security, data protection, and data privacy stances. In some industries, business customers will ask vendors whether they comply with standards such as SOC 2 as a matter of course—it’s part of their compliance procedure. However, as data protection becomes increasingly important to all customers, it should be mentioned alongside your business’s other value propositions in marketing and sales materials. 

Opportunities to highlight data protection and compliance audit certificates include:

  • In sales copy on your website, including case studies, blog articles, and one-pagers.
  • In sales enablement content and sales professional training. Your sales team should emphasize data protection and privacy as key benefits. 
  • On social media, in email marketing, and in content marketing efforts. 

In short, businesses should take every opportunity to highlight the link between their services and superior data protection and information security.

Partner With KirkpatrickPrice to Implement Data Protection Practices That Are Best for Your Business

KirkpatrickPrice is a licensed CPA firm specializing in information security compliance audits and related services, including penetration testing, security awareness training, and risk assessments. To learn more about data protection and compliance audits, contact our security and compliance specialists.

The Amazon Simple Storage Service (Amazon S3) celebrated its 15th birthday in 2021. S3 was conceived as a straightforward, scalable object storage system developers could use without concerning themselves with file systems—everything on S3 is an addressable object in a bucket.

S3 quickly rose to dominate the object storage space. Because it is used so widely, AWS S3 security, along with the privacy and confidentiality of the data businesses store in it, is critical. A vulnerability in S3 would inevitably lead to data exposure on an unprecedented scale. Amazon understands this and has built security features into S3 and integrated it with security and privacy services such as AWS Identity and Access Management (IAM).

But, as with all cloud services, security is partially the responsibility of users. If S3 buckets are poorly configured, sensitive data may be exposed. This article explores ten S3 best practices your business can implement to avoid becoming the star of the next big S3 data leak story.

Ensure S3 Buckets are Not Publicly Accessible

Data leaks from S3 buckets often occur because a bucket containing sensitive files is configured to allow public access. This means anyone on the internet who knows where the bucket is can access the files. Bad actors have created tools that make it straightforward to discover buckets with public read permissions.

When buckets are first created, they are not publicly accessible. However, rather than setting up secure bucket policies or managing access with IAM identities, users often configure buckets for public access. This is often done for convenience: the user wants a group of people to access the data and doesn’t understand how to provide that access securely.

To check whether your buckets are publicly accessible, log into the S3 Console, click on a bucket, and select the permissions tab. Access permissions are displayed at the top. The prominent “Block public access” setting lets you revoke the bucket’s public access immediately.
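
The same check and fix can be scripted. The boto3 sketch below reports a bucket’s current public access block settings and then enables all four “Block public access” options; the bucket name is a placeholder.

  import boto3
  from botocore.exceptions import ClientError

  s3 = boto3.client("s3")
  bucket = "example-sensitive-data-bucket"  # placeholder bucket name

  # Report the bucket's current public access block settings, if any.
  try:
      config = s3.get_public_access_block(Bucket=bucket)
      print(config["PublicAccessBlockConfiguration"])
  except ClientError as error:
      if error.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
          print("No public access block is configured for this bucket.")
      else:
          raise

  # Enable all four "Block public access" settings for the bucket.
  s3.put_public_access_block(
      Bucket=bucket,
      PublicAccessBlockConfiguration={
          "BlockPublicAcls": True,
          "IgnorePublicAcls": True,
          "BlockPublicPolicy": True,
          "RestrictPublicBuckets": True,
      },
  )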

You can also use the KirkpatrickPrice AWS Security Scanner to check for insecure S3 bucket permissions and other AWS cloud security vulnerabilities.

Configure Least Privilege Access

Removing public access is an essential step towards better AWS S3 security, but it is only the first step. In addition to ensuring that data can’t be accessed by everyone, you should ensure it can only be accessed by those who need the data. For example, if you want to share data in a bucket with a third party, they may only need read permissions and not write permissions. 

There are several ways to configure access permissions on buckets, but you should ordinarily use either bucket policies or IAM identities.

Both methods improve Amazon S3 security, but IAM identities are more flexible and granular. As a general rule, it is preferable to use IAM identities as part of a comprehensive identity and access management strategy. A third access control option is Access Control Lists (ACLs); however, Amazon recommends using bucket policies or IAM identities instead.
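
As one hedged example of least privilege, the sketch below attaches a bucket policy that grants a third party’s IAM role read-only access and nothing more. The bucket name and role ARN are placeholders, and put_bucket_policy replaces any existing policy on the bucket.

  import json
  import boto3

  s3 = boto3.client("s3")
  bucket = "example-shared-reports-bucket"  # placeholder bucket name

  # Grant a partner's IAM role read-only access to the bucket and its objects.
  policy = {
      "Version": "2012-10-17",
      "Statement": [{
          "Sid": "AllowPartnerReadOnly",
          "Effect": "Allow",
          "Principal": {"AWS": "arn:aws:iam::111122223333:role/partner-read-role"},  # placeholder ARN
          "Action": ["s3:GetObject", "s3:ListBucket"],
          "Resource": [
              f"arn:aws:s3:::{bucket}",
              f"arn:aws:s3:::{bucket}/*",
          ],
      }],
  }

  s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))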

Implement S3 Encryption At Rest

Data stored in S3 buckets should be encrypted. Encryption ensures the data cannot be read if it is exposed through a vulnerability or misconfiguration. S3 provides three server-side encryption options:

  • SSE-S3 — encryption with keys managed by the S3 service.
  • SSE-KMS — encryption using keys stored in AWS Key Management Service.
  • SSE-C — encryption using keys provided by the customer.

Any of these options significantly improve security compared to storing unencrypted data in S3. However, SSE-KMS gives the user more control over their keys, allowing them to, for example, rotate keys as required.
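
Default encryption can be enabled per bucket. The sketch below makes SSE-KMS the default for new objects; the bucket name and KMS key ARN are placeholders.

  import boto3

  s3 = boto3.client("s3")

  # Make SSE-KMS the default encryption for new objects written to the bucket.
  s3.put_bucket_encryption(
      Bucket="example-sensitive-data-bucket",  # placeholder bucket name
      ServerSideEncryptionConfiguration={
          "Rules": [{
              "ApplyServerSideEncryptionByDefault": {
                  "SSEAlgorithm": "aws:kms",
                  # placeholder KMS key ARN
                  "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/11112222-3333-4444-5555-666677778888",
              },
              "BucketKeyEnabled": True,  # reduces the number of KMS requests
          }]
      },
  )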

Implement S3 Encryption in Transit

In addition to encrypting data at rest in Amazon S3, data should be encrypted in transit as it moves over the network. Data is automatically encrypted within the AWS network, but users should enforce SSL/TLS when moving data across external networks, including the internet.
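
A common way to enforce encryption in transit is a bucket policy that denies any request made without TLS. Below is a minimal sketch with a placeholder bucket name (again, note that put_bucket_policy overwrites any existing policy).

  import json
  import boto3

  s3 = boto3.client("s3")
  bucket = "example-sensitive-data-bucket"  # placeholder bucket name

  # Deny any request to this bucket or its objects that does not use TLS.
  policy = {
      "Version": "2012-10-17",
      "Statement": [{
          "Sid": "DenyInsecureTransport",
          "Effect": "Deny",
          "Principal": "*",
          "Action": "s3:*",
          "Resource": [
              f"arn:aws:s3:::{bucket}",
              f"arn:aws:s3:::{bucket}/*",
          ],
          "Condition": {"Bool": {"aws:SecureTransport": "false"}},
      }],
  }

  s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))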

Store S3 Credentials Securely

If your applications access data stored in S3 buckets via the API, they will need to authenticate. To do so, they will use an AWS access key, a long-term credential associated with an IAM user that is used for programmatic authentication.

Improper use of AWS access keys can create security vulnerabilities. One common mistake is to embed access keys in code. Access keys embedded into code and then shared on version control platforms have been the root cause of many data leaks.

AWS access keys should be securely stored in AWS Secrets Manager, as we discussed in depth in How to Keep AWS Access Keys and Other Secrets Safe.
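
Instead of embedding a key in code, an application can fetch credentials at runtime. The sketch below assumes a hypothetical secret name and that the secret’s value is stored as a JSON string.

  import json
  import boto3

  secrets = boto3.client("secretsmanager")

  # Fetch a secret at runtime rather than hard-coding credentials.
  # The secret name is hypothetical and its value is assumed to be JSON.
  response = secrets.get_secret_value(SecretId="prod/reporting/api-credentials")
  credentials = json.loads(response["SecretString"])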

Use IAM Roles for Temporary S3 Access

Roles are IAM identities with a set of permissions. Unlike IAM users, roles are not associated with a single individual; users, applications, and other entities can assume a role to take on its permissions. In this context, the main benefit of roles is that they provide temporary credentials that expire after a specified period, in contrast to IAM users’ access keys, which are permanent until deleted.
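
The sketch below shows how a role might be assumed to obtain short-lived credentials for S3 access; the role ARN, session name, and bucket name are placeholders.

  import boto3

  sts = boto3.client("sts")

  # Assume a role to obtain temporary credentials that expire automatically.
  assumed = sts.assume_role(
      RoleArn="arn:aws:iam::111122223333:role/s3-read-only",  # placeholder role ARN
      RoleSessionName="temporary-s3-access",
      DurationSeconds=3600,  # credentials expire after one hour
  )
  creds = assumed["Credentials"]

  # Use the temporary credentials for S3 calls.
  s3 = boto3.client(
      "s3",
      aws_access_key_id=creds["AccessKeyId"],
      aws_secret_access_key=creds["SecretAccessKey"],
      aws_session_token=creds["SessionToken"],
  )
  print(s3.list_objects_v2(Bucket="example-sensitive-data-bucket", MaxKeys=10))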

Enable Multi-Factor Authentication for IAM Users

Multi-factor authentication adds an extra layer of security to standard username and password authentication. With MFA enabled, users must supply an additional authentication factor—a one-time code or a hardware security key. Usernames and passwords can leak or be shared inappropriately. MFA ensures that accounts remain secure even if credentials are exposed.

Enable S3 Access Logs

Access logs allow administrators to identify unusual and unexpected access patterns that may indicate a security breach. They are also useful when analyzing security incidents to discover which data has been exposed, information that may be essential to fulfilling regulatory requirements.

S3 does not ordinarily log who has accessed data and which data they have accessed, but users can activate access logs. Amazon will log access requests and store the resulting log files in a different S3 bucket. The log storage bucket should have strict access permissions to ensure bad actors can’t alter the log or use the information it contains to plan an attack.
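
Server access logging is enabled per bucket. In the sketch below, logs for a data bucket are written to a separate, tightly restricted log bucket; both bucket names are placeholders, and the log bucket must already exist and permit the S3 log delivery service to write to it.

  import boto3

  s3 = boto3.client("s3")

  # Write access logs for the data bucket to a separate, restricted log bucket.
  s3.put_bucket_logging(
      Bucket="example-sensitive-data-bucket",  # placeholder data bucket
      BucketLoggingStatus={
          "LoggingEnabled": {
              "TargetBucket": "example-access-log-bucket",  # placeholder log bucket
              "TargetPrefix": "s3-access-logs/",
          }
      },
  )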

Classify Data Stored in S3 Buckets

Many regulatory standards govern the secure storage of sensitive data, particularly health data, financial data, and personally identifiable information (PII). S3 is a viable option for storing sensitive data, if correctly configured. But to be compliant, it’s important to know which data you’re storing in the first place—accidentally dumping a database full of PII in a bucket with broad access permissions is likely to result in compliance and audit failures.

Before data is stored in S3, it should be classified and subject to a risk assessment so that businesses are aware of what they are storing and the associated risks. Amazon provides a service that can help businesses to discover sensitive information in S3 buckets. Amazon Macie is a data privacy service that uses machine learning and pattern matching to automatically identify sensitive data and alert users about insecure access permissions.

Verify S3 Bucket Configurations

Our last Amazon S3 security best practice is to check bucket configurations and IAM permissions regularly. Over time, your AWS environment will evolve from its initial conditions, so regular reviews help ensure that permissions and settings still match your security and compliance requirements. 
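
These reviews can be partly automated. The sketch below loops through every bucket in an account and flags those without a full public access block or default encryption; it assumes credentials with read-only S3 permissions and is a starting point rather than a complete review.

  import boto3
  from botocore.exceptions import ClientError

  s3 = boto3.client("s3")

  # Flag buckets missing a full public access block or default encryption.
  for bucket in s3.list_buckets()["Buckets"]:
      name = bucket["Name"]

      try:
          block = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
          if not all(block.values()):
              print(f"{name}: public access is not fully blocked")
      except ClientError:
          print(f"{name}: no public access block configured")

      try:
          s3.get_bucket_encryption(Bucket=name)
      except ClientError:
          print(f"{name}: default encryption is not configured")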

Partner with KirkpatrickPrice to Improve Your S3 Security

The KirkpatrickPrice AWS Security Scanner and cloud security audits help businesses verify their cloud security and privacy. To learn more, browse our extensive cloud security resources or contact an information security specialist today.

You thought you did everything right. You enabled multi-factor authentication (MFA) on all of your accounts and configured it so that all employees and customers are required to use it. You have automated checks set up to make sure MFA is still required. And yet you still experience a data breach. This is exactly what happened to the non-governmental organization (NGO) described in the Federal Bureau of Investigation (FBI) and Cybersecurity and Infrastructure Security Agency (CISA)’s recently released joint Cybersecurity Advisory (CSA).

In May 2021, a Russian state-sponsored actor took advantage of a misconfigured account with default MFA settings. The actor was able to register a new device for MFA and access the NGO’s network by exploiting a critical Windows Print Spooler vulnerability called “PrintNightmare.” This vulnerability allowed the Russian state-sponsored actor to run arbitrary code with system privileges, ultimately permitting them to gain access to important documents within the company’s cloud and email accounts.

This incident proves why internal audits conducted by a third party are so important. The purpose of internal audits is to provide your organization with total assurance that your information security program is actually keeping your company’s sensitive data safe. Sometimes people will hang their hat on automated audit results that provide false assurances. An automated check can say that MFA is enabled, but an experienced professional looks at it more thoroughly to make sure the configurations are working as they were intended to.

We’ve seen that many of our clients are vulnerable to this same type of incident. During one of our audits, the auditor realized that the company’s developers were completely bypassing the MFA/VPN requirement. The developers were connecting to the production environment using SSH with no MFA. If the auditor had stopped after only the automated tests, the results would have said that the VPN was in place and MFA was enabled. And while those would be true statements, they don’t accurately reflect the security posture of that company’s development practices. The company would still be at risk despite the results of their audit because automation doesn’t understand the context of what the employees’ processes look like. Only a real-life person can verify these processes are working (or not working) like they are intended to, so that a company can have total confidence in their security practices.

A Cybersecurity Checklist Isn’t Enough

If your organization wants total confidence that its security practices are keeping the company safe, it isn’t enough to put a checkmark by “MFA enabled.” Your organization needs to be performing comprehensive tests over the functionality of its configurations. While we believe a cybersecurity checklist will never be enough to fully provide your organization with the assurance it needs, reviewing or testing the following security best practices is a good place for your organization to start (a brief scripted check follows the list):

  • Test the MFA enrollment process
  • Test whether disabled accounts can be used to bypass MFA requirements
  • Review the VPN configuration to ensure 256-bit encryption through modern protocols like OpenVPN or IKEv2
  • Review the VPN configuration to ensure MFA is enforced
  • Verify that the method of administrative access used to segment remote systems from production (e.g., a jump server (bastion host), AWS Systems Manager, etc.) is properly segmenting systems and users
  • Review the protocols enabled to administer systems and their source (e.g., SSH or RDP over the VPN from the jump server only, with no direct access from the Internet)
  • Review cloud application and production configurations to ensure they can only be administered from approved network devices after authenticating over the VPN
  • Allow remote desktop access only over a VPN with MFA (no direct access from the Internet)
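
Automated checks can support, but never replace, that kind of review. As one hedged illustration, the boto3 sketch below lists IAM users who have console passwords but no MFA device registered; it cannot tell whether MFA is actually enforced or whether other access paths, such as direct SSH, bypass it, which is exactly where an experienced auditor adds value.

  import boto3
  from botocore.exceptions import ClientError

  iam = boto3.client("iam")

  # List IAM users who can sign in to the console but have no MFA device.
  # A first-pass check only: it says nothing about whether MFA is enforced.
  paginator = iam.get_paginator("list_users")
  for page in paginator.paginate():
      for user in page["Users"]:
          username = user["UserName"]
          try:
              iam.get_login_profile(UserName=username)  # raises if no console password
          except ClientError:
              continue  # no console access; skip this user
          if not iam.list_mfa_devices(UserName=username)["MFADevices"]:
              print(f"{username} has console access but no MFA device")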

Only an Audit with an Experienced Security Professional Can Give You the Assurance Your Organization Needs

While all of the above steps are good practices for your organization’s configuration management processes, conducting a third-party audit with a firm like KirkpatrickPrice is the best way to gain the assurance your company needs. Only an internal audit or continuous penetration testing conducted by an experienced security professional can prove that your organization has implemented the best security controls for the protection of your sensitive data and that those controls are functioning correctly. An automated tool can check that those controls are in place, but it can’t evaluate their functionality. Our experts can find exactly how your configurations are working and provide the guidance needed to strengthen your organization’s security posture. Because at the end of the day, it isn’t enough to just have MFA enabled. You need to be sure that your MFA configurations are keeping bad actors away from your valuable data.

KirkpatrickPrice Can Give You That Assurance

Let KirkpatrickPrice give you the assurance you need through an audit or penetration test. Contact our experts today to see which services are right for you and make sure you’re secure.