Cloud computing has become one of the most popular methods for server hosting, thanks to the scalability and reliability it offers. A vast range of services rely on hosting within the AWS, Google Cloud, and Microsoft Azure platforms. However, much like devices on private networks, cloud platforms are still susceptible to malicious attacks.
One of the most significant recent data breaches occurred in 2019, when the American financial services firm First American Financial Corp. exposed roughly 885 million records, including bank account numbers, bank statements, mortgage and tax records, Social Security numbers, and driver's license images. The data was accessible to anyone with a web browser: because no authentication system protected the documents, simply knowing a document's URL was enough to load it.
Every year, the news is filled with stories of such security breaches. So, how can your company take action to ensure the data you store in the cloud remains secure? Read on to learn more about AWS infrastructure security best practices.
Amazon has a shared responsibility model, which you agree to when signing up for the platform. This model highlights the responsibilities Amazon has to safeguard the security of their systems, and what you need to do to adhere to their security guidelines when working in the cloud. Following these AWS security best practices is beneficial to the security of the entire cloud platform.
Amazon takes full responsibility for the software and hardware used to make its services available to you. It maintains these systems by keeping software up to date and promptly replacing any failing hardware, and it monitors AWS security services to ensure they adhere to these guidelines. Amazon also takes full responsibility for the security configuration of its managed services, including DynamoDB, Redshift, Elastic MapReduce, and WorkSpaces. If a security breach occurs within this part of the cloud platform, Amazon takes responsibility for repairing the affected machines, as well as for compensation when data loss occurs.
Don’t think the provider absorbs all liability, however. As a user of the platform, you are expected to take reasonable steps to secure your server clusters. Proper use of IAM user permissions is one requirement, alongside two-factor authentication on accounts with significant access privileges. If Amazon determines that you have neglected these basic security principles, it will not take responsibility for data loss when your Amazon Web Services security is compromised.
In summary, as a user of AWS, you may need to assume the responsibilities outlined in the sections below.
One of the worst things you can do, in terms of security, is to use a root account for day-to-day functions in the cloud.
On consumer Windows operating systems, a feature called User Account Control (UAC) requires explicit approval before a program can run with elevated privileges. This segregation of permissions helps mitigate the chance of malicious code damaging the system, because the regular user account does not have the authority to change system files.
The Identity and Access Management (IAM) user account feature in AWS is very similar in function to UAC on Windows. Rather than granting root access to all the data on your server, an IAM user can be granted specific rights on the system. This granular control allows specific functionality to be granted to specific users, significantly reducing the chance of widespread data loss in the cloud. You can also implement various rules for IAM accounts, such as setting a minimum password complexity and requiring passwords to be changed as often as you deem necessary.
You can grant these IAM accounts similar permissions to a root account, with the added security of knowing you can disable these accounts if they become compromised. The root account is the only way you can change your service plan or delete your server cluster on AWS, so it’s always best to minimize root account usage with the strategic use of IAM users.
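As a sketch of the least-privilege idea above, the snippet below builds an IAM policy document that grants read-only access to a single S3 bucket. The bucket name `example-reports-bucket` is a hypothetical placeholder; in practice you would attach the resulting JSON to an IAM user or group via the console, the AWS CLI, or an SDK.

```python
import json

def build_readonly_s3_policy(bucket_name: str) -> str:
    """Return an IAM policy JSON granting read-only access to one bucket."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                # Only the actions needed to list and read objects.
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",       # the bucket itself
                    f"arn:aws:s3:::{bucket_name}/*",     # objects within it
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Hypothetical bucket name, for illustration only.
print(build_readonly_s3_policy("example-reports-bucket"))
```

Starting from a narrow document like this and adding actions only as users need them is far safer than trimming down a broad policy after the fact.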
It is important to consider what custom application security best practices you should implement, especially when you are storing confidential information that needs to be available to end users.
We recommend keeping an inventory of each custom application running on your cloud infrastructure. This inventory should also include a set of compliance requirements, which security teams should regularly review. Maintaining a list of threats to the custom application is also a good practice, as it allows security personnel to monitor these applications in a more focused manner.
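The inventory described above can be sketched as a simple data structure. The application names, teams, requirements, and threat entries below are hypothetical placeholders; a real inventory would live in a tracked system, not a script.

```python
from dataclasses import dataclass, field

@dataclass
class AppInventoryEntry:
    """One custom application, with its compliance and threat records."""
    name: str
    owner: str
    compliance_requirements: list = field(default_factory=list)
    known_threats: list = field(default_factory=list)

    def needs_review(self) -> bool:
        # Flag entries with outstanding threats or no documented requirements.
        return bool(self.known_threats) or not self.compliance_requirements

# Hypothetical example inventory.
inventory = [
    AppInventoryEntry("billing-portal", "finance-team",
                      compliance_requirements=["PCI DSS"],
                      known_threats=["SQL injection in legacy search form"]),
    AppInventoryEntry("internal-wiki", "it-team",
                      compliance_requirements=["internal policy 4.2"]),
]

flagged = [app.name for app in inventory if app.needs_review()]
print(flagged)  # → ['billing-portal']
```

A report like `flagged` gives security teams a focused starting point for their regular reviews.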
In the computing world, less is certainly more with permissions. Overly permissive custom application policies on the network can expose sensitive information to the wrong employees, so it is better to keep permissions to a bare minimum. You can always grant more access when needed, but revoking permissions after the damage has been done comes too late.
With custom applications that handle highly sensitive datasets, encrypt the data so that it is useless to anyone without the decryption key. One of the most popular encryption algorithms is AES, which supports keys up to 256 bits. Use it on datasets like protected health information (PHI) and personally identifiable information (PII): addresses, names, and identification numbers.
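As an illustrative sketch of the approach above (not the only way to do this), the snippet below encrypts a hypothetical PII record with AES-256 in GCM mode using the third-party `cryptography` package. The record contents are fake, and a real deployment would also need key management, for example via AWS KMS, rather than a key generated in-process.

```python
import os

# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 256-bit key, matching the AES-256 strength discussed above.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

# Hypothetical PII record, for illustration only.
record = b'{"name": "Jane Doe", "id_number": "000-00-0000"}'

# GCM requires a unique nonce for every encryption under the same key.
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, record, None)

# Only a holder of the key (and nonce) can recover the plaintext;
# GCM also authenticates the data, so tampering raises an error.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == record
```

GCM is a sensible default here because it provides integrity checking alongside confidentiality, so modified ciphertext is rejected rather than silently decrypted to garbage.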
It is vital to pay attention to your security practices when working in the cloud. Unlike private server systems, the public cloud is much more easily accessible by those with malicious intent.
One of the best ways to secure your cloud instances is by using the IAM user accounts function. Focus time and effort on making policies that only allow access to those who need it. Never use the root account for general computing tasks; this goes against AWS best practices and leaves you susceptible to a complete compromise of your servers.
Keep an audit trail for every change on your network to effectively track how users and administrators work on your servers. This way, you can detect malicious activity before it causes irreparable damage, protecting your data and reputation.
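One way to reason about audit trails is hash chaining: each entry commits to the one before it, so altering history breaks the chain and tampering becomes detectable. The sketch below is a simplified, self-contained illustration of that idea, not a replacement for a managed service such as AWS CloudTrail; the user names and actions are hypothetical.

```python
import hashlib
import json
import time

class AuditTrail:
    """A toy tamper-evident log: each entry hashes the previous entry."""

    def __init__(self):
        self.entries = []

    def record(self, user: str, action: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"user": user, "action": action,
                "time": time.time(), "prev_hash": prev_hash}
        # The entry's hash covers its contents and its link to the past.
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        """Recompute every hash; any edited entry breaks the chain."""
        prev_hash = "0" * 64
        for entry in self.entries:
            unsigned = {k: v for k, v in entry.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(unsigned, sort_keys=True).encode()).hexdigest()
            if entry["prev_hash"] != prev_hash or entry["hash"] != expected:
                return False
            prev_hash = entry["hash"]
        return True

# Hypothetical activity, for illustration only.
trail = AuditTrail()
trail.record("alice", "modified security group sg-example")
trail.record("bob", "launched instance i-example")
print(trail.verify())  # → True
```

If an attacker rewrites an earlier entry without recomputing every later hash, `verify()` returns `False`, which is exactly the property that lets you trust an audit trail after the fact.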
Finally, pay close attention to custom applications running on your server. These are not covered by Amazon’s policy in any way, leaving you solely responsible for the data contained within. Use a mixture of encryption and account policy to keep data secure when granting users access to these custom applications.