Amazon SAA-C03 Exam Dumps


AWS Certified Solutions Architect - Associate (SAA-C03)

683 Questions & Answers with Explanation
Update Date: November 10, 2024
PDF + Test Engine: $65 (regular price $95)
Test Engine: $55 (regular price $85)
PDF Only: $45 (regular price $75)

Money-Back Guarantee

We do not compromise on the future of our customers. PassExam4Sure takes its clients' success seriously, and we make sure our SAA-C03 exam dumps get you through the exam. If our questions and answers do not help you with the exam paper and you fail it, we will happily return your payment with a full 100% refund.

100% Real Questions

We verify the authenticity of our Amazon SAA-C03 exam dump PDFs, which contain 100% real, exam-oriented questions. Our questions and answers are drawn from the latest and most recent exams in which you are going to appear, so our library of Amazon SAA-C03 exam dumps will keep you moving forward on the path to success.

Security & Privacy

Free Amazon SAA-C03 demo papers are available for download so customers can verify the authenticity of our exam paper samples and see exactly what they will be getting from PassExam4Sure. Many of our daily visitors try this process before purchasing the Amazon SAA-C03 exam dumps.



Last Week SAA-C03 Exam Results

169

Customers Passed Amazon SAA-C03 Exam

98%

Average Score In Real SAA-C03 Exam

99%

Questions Came From Our SAA-C03 Dumps



Authentic SAA-C03 Exam Dumps


Prepare for the Amazon SAA-C03 Exam Like a Pro

PassExam4Sure is known for its top-notch service, providing the most helpful, accurate, and up-to-date material for the Amazon SAA-C03 exam in PDF form. Our SAA-C03 dumps for this particular exam are reviewed regularly for content changes, format updates, and new questions from recently conducted exams. Our highly qualified professionals guarantee that you will pass your exam with at least 85% marks overall. PassExam4Sure Amazon SAA-C03 proven dumps are the best possible way to prepare for and pass your certification exam.

Easy Access and Friendly UI

PassExam4Sure provides you with the latest and most accurate material without hidden charges or pointless scrolling. We value your time and work hard to deliver well-formatted PDFs with accurate, to-the-point, and vital information about Amazon SAA-C03. PassExam4Sure is your 24/7 study partner, and our exam material is prepared so that it is easily readable on smartphones, tablets, and laptop PCs.

PassExam4Sure - The Undisputed King for Preparing for the SAA-C03 Exam

We focus on providing you with the best course material for Amazon SAA-C03 so that you can prepare for your exam like a pro and get certified in no time. Our practice exam material will give you the confidence you need to sit, relax, and take the exam as if in a real exam environment. If you truly want success, simply sign up for PassExam4Sure Amazon SAA-C03 exam material. Millions of people all over the globe have completed their certification using PassExam4Sure exam dumps for Amazon SAA-C03.

100% Authentic Amazon SAA-C03 – Study Guide (Updated 2024)

Our Amazon SAA-C03 exam questions and answers are reviewed weekly. A team of highly qualified Amazon professionals, who themselves cleared the exams using our certification content, analyzes our most recent exam dumps. The team makes sure that you get the latest exam content to practice and polish your skills the right way. All you have to do now is practice, and practice a lot, by taking our demo question exams and making sure that you prepare well for the final examination. The Amazon SAA-C03 test will challenge you and play with your mind and psychology, so be prepared for what's coming. PassExam4Sure is here to help and guide you through every step of your preparation. You can check out our free downloadable demo content if you would like to test us before investing your hard-earned money. PassExam4Sure guarantees your success in the Amazon SAA-C03 exam because we have the newest and most authentic exam material, which cannot be found anywhere else on the internet.


Amazon SAA-C03 Sample Questions

Question # 1

A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution. What should the solutions architect do to meet these requirements?

A. Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.
B. Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.
C. Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.
D. Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.
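
For illustration, here is a minimal, hypothetical Python/boto3 sketch of the Kinesis-based pipeline described in option A: a producer pushes score updates to a Kinesis data stream, and a Lambda function consumes the stream records and writes them to DynamoDB. The stream, table, and field names are assumptions, not part of the question.

```python
import base64
import json

import boto3

kinesis = boto3.client("kinesis")
table = boto3.resource("dynamodb").Table("GameScores")  # hypothetical table name


def push_score_update(player_id: str, score: int) -> None:
    """Producer side: send a score update to the stream (hypothetical names)."""
    kinesis.put_record(
        StreamName="score-updates",  # hypothetical stream name
        Data=json.dumps({"player_id": player_id, "score": score}).encode(),
        PartitionKey=player_id,      # keeps a given player's updates in order
    )


def lambda_handler(event, context):
    """Consumer side: Lambda is invoked with batches of Kinesis records."""
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item={
            "player_id": payload["player_id"],
            "score": payload["score"],
        })
    return {"processed": len(event["Records"])}
```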



Question # 2

A company runs an SMB file server in its data center. The file server stores large files that the company frequently accesses for up to 7 days after the file creation date. After 7 days, the company needs to be able to access the files with a maximum retrieval time of 24 hours. Which solution will meet these requirements?

A. Use AWS DataSync to copy data that is older than 7 days from the SMB file server to AWS.
B. Create an Amazon S3 File Gateway to increase the company's storage space. Create an S3 Lifecycle policy to transition the data to S3 Glacier Deep Archive after 7 days.
C. Create an Amazon FSx File Gateway to increase the company's storage space. Create an Amazon S3 Lifecycle policy to transition the data after 7 days.
D. Configure access to Amazon S3 for each user. Create an S3 Lifecycle policy to transition the data to S3 Glacier Flexible Retrieval after 7 days.



Question # 3

A company has an organization in AWS Organizations that has all features enabled. The company requires that all API calls and logins in any existing or new AWS account must be audited. The company needs a managed solution to prevent additional work and to minimize costs. The company also needs to know when any AWS account is not compliant with the AWS Foundational Security Best Practices (FSBP) standard. Which solution will meet these requirements with the LEAST operational overhead?

A. Deploy an AWS Control Tower environment in the Organizations management account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
B. Deploy an AWS Control Tower environment in a dedicated Organizations member account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
C. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision Amazon GuardDuty in the MALZ.
D. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision AWS Security Hub in the MALZ.



Question # 4

A solutions architect is designing a user authentication solution for a company. The solution must invoke two-factor authentication for users that log in from inconsistent geographical locations, IP addresses, or devices. The solution must also be able to scale up to accommodate millions of users. Which solution will meet these requirements?

A. Configure Amazon Cognito user pools for user authentication. Enable the risk-based adaptive authentication feature with multi-factor authentication (MFA).
B. Configure Amazon Cognito identity pools for user authentication. Enable multi-factor authentication (MFA).
C. Configure AWS Identity and Access Management (IAM) users for user authentication. Attach an IAM policy that allows the AllowManageOwnUserMFA action.
D. Configure AWS IAM Identity Center (AWS Single Sign-On) authentication for user authentication. Configure the permission sets to require multi-factor authentication (MFA).



Question # 5

A solutions architect needs to design the architecture for an application that a vendor provides as a Docker container image. The container needs 50 GB of storage available for temporary files. The infrastructure must be serverless. Which solution meets these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function that uses the Docker container image with an Amazon S3 mounted volume that has more than 50 GB of space.
B. Create an AWS Lambda function that uses the Docker container image with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the AWS Fargate launch type. Create a task definition for the container image with an Amazon Elastic File System (Amazon EFS) volume. Create a service with that task definition.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the Amazon EC2 launch type with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space. Create a task definition for the container image. Create a service with that task definition.



Question # 6

A company uses AWS Organizations to run workloads within multiple AWS accounts. A tagging policy adds department tags to AWS resources when the company creates tags. An accounting team needs to determine spending on Amazon EC2 consumption. The accounting team must determine which departments are responsible for the costs regardless of AWS account. The accounting team has access to AWS Cost Explorer for all AWS accounts within the organization and needs to access all reports from Cost Explorer. Which solution meets these requirements in the MOST operationally efficient way?

A. From the Organizations management account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
B. From the Organizations management account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
C. From the Organizations member account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by the tag name, and filter by EC2.
D. From the Organizations member account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.



Question # 7

A company is building an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for its workloads. All secrets that are stored in Amazon EKS must be encrypted in the Kubernetes etcd key-value store. Which solution will meet these requirements?

A. Create a new AWS Key Management Service (AWS KMS) key. Use AWS Secrets Manager to manage, rotate, and store all secrets in Amazon EKS.
B. Create a new AWS Key Management Service (AWS KMS) key. Enable Amazon EKS KMS secrets encryption on the Amazon EKS cluster.
C. Create the Amazon EKS cluster with default options. Use the Amazon Elastic Block Store (Amazon EBS) Container Storage Interface (CSI) driver as an add-on.
D. Create a new AWS Key Management Service (AWS KMS) key with the alias/aws/ebs alias. Enable default Amazon Elastic Block Store (Amazon EBS) volume encryption for the account.
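
As a minimal, hypothetical sketch of the KMS-based secrets encryption that option B describes, the following Python/boto3 snippet creates a KMS key and then creates an EKS cluster with secrets encryption enabled. The cluster name, role ARN, and subnet IDs are placeholders. For an existing cluster, the EKS AssociateEncryptionConfig API can enable the same setting.

```python
import boto3

kms = boto3.client("kms")
eks = boto3.client("eks")

# Create a customer managed KMS key to envelope-encrypt Kubernetes secrets.
key_arn = kms.create_key(Description="EKS secrets encryption key")["KeyMetadata"]["Arn"]

# Create the EKS cluster with secrets encryption turned on.
eks.create_cluster(
    name="workload-cluster",                                     # placeholder
    roleArn="arn:aws:iam::111122223333:role/eks-cluster-role",   # placeholder
    resourcesVpcConfig={"subnetIds": ["subnet-aaa", "subnet-bbb"]},  # placeholders
    encryptionConfig=[{
        "resources": ["secrets"],            # encrypt Kubernetes secrets in etcd
        "provider": {"keyArn": key_arn},
    }],
)
```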



Question # 8

A retail company has several businesses. The IT team for each business manages its own AWS account. Each team account is part of an organization in AWS Organizations. Each team monitors its product inventory levels in an Amazon DynamoDB table in the team's own AWS account. The company is deploying a central inventory reporting application into a shared AWS account. The application must be able to read items from all the teams' DynamoDB tables. Which authentication option will meet these requirements MOST securely?

A. Integrate DynamoDB with AWS Secrets Manager in the inventory application account. Configure the application to use the correct secret from Secrets Manager to authenticate and read the DynamoDB table. Schedule secret rotation for every 30 days.
B. In every business account, create an IAM user that has programmatic access. Configure the application to use the correct IAM user access key ID and secret access key to authenticate and read the DynamoDB table. Manually rotate IAM access keys every 30 days.
C. In every business account, create an IAM role named BU_ROLE with a policy that gives the role access to the DynamoDB table and a trust policy to trust a specific role in the inventory application account. In the inventory account, create a role named APP_ROLE that allows access to the STS AssumeRole API operation. Configure the application to use APP_ROLE and assume the cross-account role BU_ROLE to read the DynamoDB table.
D. Integrate DynamoDB with AWS Certificate Manager (ACM). Generate identity certificates to authenticate DynamoDB. Configure the application to use the correct certificate to authenticate and read the DynamoDB table.
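
The cross-account role pattern in option C can be sketched as follows in Python/boto3, assuming the application already runs under APP_ROLE and each business account exposes a BU_ROLE that trusts it. The account IDs, role names, and table name are placeholders.

```python
import boto3


def read_team_inventory(account_id: str, table_name: str = "Inventory"):
    """Assume the team's BU_ROLE and scan its DynamoDB table (hypothetical names)."""
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/BU_ROLE",
        RoleSessionName="inventory-reporting",
    )["Credentials"]

    dynamodb = boto3.resource(
        "dynamodb",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    return dynamodb.Table(table_name).scan()["Items"]


# Example: aggregate items across several business accounts (placeholder IDs).
all_items = [item for acct in ("111122223333", "444455556666")
             for item in read_team_inventory(acct)]
```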



Question # 9

A company built an application with Docker containers and needs to run the application in the AWS Cloud. The company wants to use a managed service to host the application. The solution must scale in and out appropriately according to demand on the individual container services. The solution also must not result in additional operational overhead or infrastructure to manage. Which solutions will meet these requirements? (Select TWO)

A. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate.
B. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate.
C. Provision an Amazon API Gateway API. Connect the API to AWS Lambda to run the containers.
D. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes.
E. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes.



Question # 10

A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead. Which solution will meet these requirements?

A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.
B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.
C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.
D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.



Question # 11

A company hosts an application used to upload files to an Amazon S3 bucket. Once uploaded, the files are processed to extract metadata, which takes less than 5 seconds. The volume and frequency of the uploads varies from a few files each hour to hundreds of concurrent uploads. The company has asked a solutions architect to design a cost-effective architecture that will meet these requirements. What should the solutions architect recommend?

A. Configure AWS CloudTrail trails to log S3 API calls. Use AWS AppSync to process the files.
B. Configure an object-created event notification within the S3 bucket to invoke an AWS Lambda function to process the files.
C. Configure Amazon Kinesis Data Streams to process and send data to Amazon S3. Invoke an AWS Lambda function to process the files.
D. Configure an Amazon Simple Notification Service (Amazon SNS) topic to process the files uploaded to Amazon S3. Invoke an AWS Lambda function to process the files.
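
A minimal, hypothetical sketch of the object-created event notification described in option B, using Python/boto3: S3 is granted permission to invoke the function, and the bucket is configured to send ObjectCreated events to it. The bucket name and function ARN are placeholders.

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

BUCKET = "upload-bucket"  # placeholder
FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:extract-metadata"  # placeholder

# Allow the S3 bucket to invoke the Lambda function.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="AllowS3Invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{BUCKET}",
)

# Send ObjectCreated events from the bucket to the Lambda function.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": FUNCTION_ARN,
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)
```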



Question # 12

A company runs analytics software on Amazon EC2 instances. The software accepts job requests from users to process data that has been uploaded to Amazon S3. Users report that some submitted data is not being processed. Amazon CloudWatch reveals that the EC2 instances have a consistent CPU utilization at or near 100%. The company wants to improve system performance and scale the system based on user load. What should a solutions architect do to meet these requirements?

A. Create a copy of the instance. Place all instances behind an Application Load Balancer.
B. Create an S3 VPC endpoint for Amazon S3. Update the software to reference the endpoint.
C. Stop the EC2 instances. Modify the instance type to one with a more powerful CPU and more memory. Restart the instances.
D. Route incoming requests to Amazon Simple Queue Service (Amazon SQS). Configure an EC2 Auto Scaling group based on queue size. Update the software to read from the queue.
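
To illustrate the queue-based decoupling in option D, here is a small, hypothetical Python/boto3 sketch: job requests are sent to an SQS queue, and worker instances poll the queue and delete messages only after successful processing. The queue URL and the process_job callable are assumptions; in practice the Auto Scaling group would scale on a queue-depth metric such as ApproximateNumberOfMessagesVisible.

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/analytics-jobs"  # placeholder


def submit_job(s3_key: str) -> None:
    """Producer: enqueue a job that references data already uploaded to S3."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps({"s3_key": s3_key}))


def worker_loop(process_job) -> None:
    """Consumer: long-poll the queue; delete a message only after it is processed."""
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
        )
        for msg in resp.get("Messages", []):
            process_job(json.loads(msg["Body"]))
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```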



Question # 13

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes. Which combination of network solutions will meet these requirements? (Select TWO)

A. Enable and configure enhanced networking on each EC2 instance.
B. Group the EC2 instances in separate accounts.
C. Run the EC2 instances in a cluster placement group.
D. Attach multiple elastic network interfaces to each EC2 instance.
E. Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.



Question # 14

A company runs a container application on a Kubernetes cluster in the company's data center. The application uses Advanced Message Queuing Protocol (AMQP) to communicate with a message queue. The data center cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the workloads to AWS. Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the container application to Amazon Elastic Container Service (Amazon ECS). Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.
B. Migrate the container application to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon MQ to retrieve the messages.
C. Use highly available Amazon EC2 instances to run the application. Use Amazon MQ to retrieve the messages.
D. Use AWS Lambda functions to run the application. Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.



Question # 15

A company runs a real-time data ingestion solution on AWS. The solution consists of the most recent version of Amazon Managed Streaming for Apache Kafka (Amazon MSK). The solution is deployed in a VPC in private subnets across three Availability Zones. A solutions architect needs to redesign the data ingestion solution to be publicly available over the internet. The data in transit must also be encrypted. Which solution will meet these requirements with the MOST operational efficiency?

A. Configure public subnets in the existing VPC. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
B. Create a new VPC that has public subnets. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
C. Deploy an Application Load Balancer (ALB) that uses private subnets. Configure an ALB security group inbound rule to allow inbound traffic from the VPC CIDR block for HTTPS protocol.
D. Deploy a Network Load Balancer (NLB) that uses private subnets. Configure an NLB listener for HTTPS communication over the internet.



Question # 16

A company runs a Java-based job on an Amazon EC2 instance. The job runs every hour and takes 10 seconds to run. The job runs on a scheduled interval and consumes 1 GB of memory. The CPU utilization of the instance is low except for short surges during which the job uses the maximum CPU available. The company wants to optimize the costs to run the job. Which solution will meet these requirements?

A. Use AWS App2Container (A2C) to containerize the job. Run the job as an Amazon Elastic Container Service (Amazon ECS) task on AWS Fargate with 0.5 virtual CPU (vCPU) and 1 GB of memory.
B. Copy the code into an AWS Lambda function that has 1 GB of memory. Create an Amazon EventBridge scheduled rule to run the code each hour.
C. Use AWS App2Container (A2C) to containerize the job. Install the container in the existing Amazon Machine Image (AMI). Ensure that the schedule stops the container when the task finishes.
D. Configure the existing schedule to stop the EC2 instance at the completion of the job and restart the EC2 instance when the next job starts.
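
As a hypothetical sketch of the scheduled invocation pattern described in option B, the following Python/boto3 snippet creates an EventBridge rule with an hourly schedule, permits EventBridge to invoke a Lambda function, and attaches the function as the rule's target. The rule and function names are placeholders.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:hourly-job"  # placeholder

# Create (or update) an hourly schedule rule.
rule_arn = events.put_rule(
    Name="hourly-job-schedule",        # placeholder
    ScheduleExpression="rate(1 hour)",
    State="ENABLED",
)["RuleArn"]

# Allow EventBridge to invoke the function.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="AllowEventBridgeInvoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn,
)

# Point the rule at the Lambda function.
events.put_targets(
    Rule="hourly-job-schedule",
    Targets=[{"Id": "hourly-job-target", "Arn": FUNCTION_ARN}],
)
```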



Question # 17

An ecommerce company runs applications in AWS accounts that are part of an organization in AWS Organizations. The applications run on Amazon Aurora PostgreSQL databases across all the accounts. The company needs to prevent malicious activity and must identify abnormal failed and incomplete login attempts to the databases. Which solution will meet these requirements in the MOST operationally efficient way?

A. Attach service control policies (SCPs) to the root of the organization to identify the failed login attempts.
B. Enable the Amazon RDS Protection feature in Amazon GuardDuty for the member accounts of the organization.
C. Publish the Aurora general logs to a log group in Amazon CloudWatch Logs. Export the log data to a central Amazon S3 bucket.
D. Publish all the Aurora PostgreSQL database events in AWS CloudTrail to a central Amazon S3 bucket.



Question # 18

A company needs to provide customers with secure access to its data. The company processes customer data and stores the results in an Amazon S3 bucket. All the data is subject to strong regulations and security requirements. The data must be encrypted at rest. Each customer must be able to access only their data from their AWS account. Company employees must not be able to access the data. Which solution will meet these requirements?

A. Provision an AWS Certificate Manager (ACM) certificate for each customer. Encrypt the data client-side. In the private certificate policy, deny access to the certificate for all principals except an IAM role that the customer provides.
B. Provision a separate AWS Key Management Service (AWS KMS) key for each customer. Encrypt the data server-side. In the S3 bucket policy, deny decryption of data for all principals except an IAM role that the customer provides.
C. Provision a separate AWS Key Management Service (AWS KMS) key for each customer. Encrypt the data server-side. In each KMS key policy, deny decryption of data for all principals except an IAM role that the customer provides.
D. Provision an AWS Certificate Manager (ACM) certificate for each customer. Encrypt the data client-side. In the public certificate policy, deny access to the certificate for all principals except an IAM role that the customer provides.
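
A minimal, hypothetical sketch of the per-customer KMS key approach in option C, using Python/boto3: a key is created whose key policy lets only key administrators manage it and only the customer-provided IAM role use it for decryption, and that customer's objects are written with SSE-KMS under that key. The ARNs, bucket name, and deliberately simplified policy are assumptions for illustration.

```python
import json

import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

ADMIN_ACCOUNT = "arn:aws:iam::111122223333:root"                  # placeholder
CUSTOMER_ROLE = "arn:aws:iam::999988887777:role/CustomerAccess"   # placeholder

key_policy = {
    "Version": "2012-10-17",
    "Statement": [
        # Key administration stays with the data-owning account.
        {"Sid": "KeyAdmin", "Effect": "Allow",
         "Principal": {"AWS": ADMIN_ACCOUNT},
         "Action": "kms:*", "Resource": "*"},
        # Only the customer's role may use the key to decrypt their data.
        {"Sid": "CustomerDecrypt", "Effect": "Allow",
         "Principal": {"AWS": CUSTOMER_ROLE},
         "Action": ["kms:Decrypt", "kms:DescribeKey"], "Resource": "*"},
    ],
}

key_id = kms.create_key(
    Description="Per-customer data key", Policy=json.dumps(key_policy)
)["KeyMetadata"]["KeyId"]

# Store the customer's object encrypted server-side with their key.
s3.put_object(
    Bucket="customer-results",                    # placeholder
    Key="customer-999988887777/report.csv",       # placeholder
    Body=b"...",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId=key_id,
)
```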



Question # 19

A company has a nightly batch processing routine that analyzes report files that an on-premises file system receives daily through SFTP. The company wants to move the solution to the AWS Cloud. The solution must be highly available and resilient. The solution also must minimize operational effort. Which solution meets these requirements?

A. Deploy AWS Transfer for SFTP and an Amazon Elastic File System (Amazon EFS) file system for storage. Use an Amazon EC2 instance in an Auto Scaling group with a scheduled scaling policy to run the batch operation.
B. Deploy an Amazon EC2 instance that runs Linux and an SFTP service. Use an Amazon Elastic Block Store (Amazon EBS) volume for storage. Use an Auto Scaling group with the minimum number of instances and desired number of instances set to 1.
C. Deploy an Amazon EC2 instance that runs Linux and an SFTP service. Use an Amazon Elastic File System (Amazon EFS) file system for storage. Use an Auto Scaling group with the minimum number of instances and desired number of instances set to 1.
D. Deploy AWS Transfer for SFTP and an Amazon S3 bucket for storage. Modify the application to pull the batch files from Amazon S3 to an Amazon EC2 instance for processing. Use an EC2 instance in an Auto Scaling group with a scheduled scaling policy to run the batch operation.



Question # 20

A company uses high concurrency AWS Lambda functions to process a constantly increasing number of messages in a message queue during marketing events. The Lambda functions use CPU intensive code to process the messages. The company wants to reduce the compute costs and to maintain service latency for its customers. Which solution will meet these requirements?

A. Configure reserved concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions.
B. Configure reserved concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations.
C. Configure provisioned concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions.
D. Configure provisioned concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations.



Question # 21

A company runs applications on AWS that connect to the company's Amazon RDS database. The applications scale on weekends and at peak times of the year. The company wants to scale the database more effectively for its applications that connect to the database. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon DynamoDB with connection pooling with a target group configuration for the database. Change the applications to use the DynamoDB endpoint.
B. Use Amazon RDS Proxy with a target group for the database. Change the applications to use the RDS Proxy endpoint.
C. Use a custom proxy that runs on Amazon EC2 as an intermediary to the database. Change the applications to use the custom proxy endpoint.
D. Use an AWS Lambda function to provide connection pooling with a target group configuration for the database. Change the applications to use the Lambda function.



Question # 22

A company wants to run its payment application on AWS. The application receives payment notifications from mobile devices. Payment notifications require a basic validation before they are sent for further processing. The backend processing application is long running and requires compute and memory to be adjusted. The company does not want to manage the infrastructure. Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS) Anywhere. Create a standalone cluster.
B. Create an Amazon API Gateway API. Integrate the API with an AWS Step Functions state machine to receive payment notifications from mobile devices. Invoke the state machine to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS). Configure an EKS cluster with self-managed nodes.
C. Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon EC2 Spot Instances. Configure a Spot Fleet with a default allocation strategy.
D. Create an Amazon API Gateway API. Integrate the API with AWS Lambda to receive payment notifications from mobile devices. Invoke a Lambda function to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon ECS with an AWS Fargate launch type.



Question # 23

A company has multiple AWS accounts with applications deployed in the us-west-2 Region. Application logs are stored within Amazon S3 buckets in each account. The company wants to build a centralized log analysis solution that uses a single S3 bucket. Logs must not leave us-west-2, and the company wants to incur minimal operational overhead. Which solution meets these requirements and is MOST cost-effective?

A. Create an S3 Lifecycle policy that copies the objects from one of the application S3 buckets to the centralized S3 bucket.
B. Use S3 Same-Region Replication to replicate logs from the S3 buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
C. Write a script that uses the PutObject API operation every day to copy the entire contents of the buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
D. Write AWS Lambda functions in these accounts that are triggered every time logs are delivered to the S3 buckets (s3:ObjectCreated:* event). Copy the logs to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
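
For illustration, a hypothetical Python/boto3 sketch of the Same-Region Replication setup in option B: versioning must be enabled on both buckets, and a replication rule on each source bucket points at the centralized destination bucket. The bucket names and IAM role ARN are placeholders.

```python
import boto3

s3 = boto3.client("s3")

SOURCE_BUCKET = "app-logs-account-a"                                 # placeholder
DEST_BUCKET_ARN = "arn:aws:s3:::central-log-analysis"                # placeholder
REPLICATION_ROLE = "arn:aws:iam::111122223333:role/s3-replication"   # placeholder

# Replication requires versioning on both the source and destination buckets.
s3.put_bucket_versioning(
    Bucket=SOURCE_BUCKET, VersioningConfiguration={"Status": "Enabled"}
)

# Replicate every new object in the source bucket to the central bucket.
s3.put_bucket_replication(
    Bucket=SOURCE_BUCKET,
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE,
        "Rules": [{
            "ID": "replicate-logs",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": DEST_BUCKET_ARN},
        }],
    },
)
```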



Question # 24

A company runs a highly available web application on Amazon EC2 instances behind an Application Load Balancer. The company uses Amazon CloudWatch metrics. As the traffic to the web application increases, some EC2 instances become overloaded with many outstanding requests. The CloudWatch metrics show that the number of requests processed and the time to receive the responses from some EC2 instances are both higher compared to other EC2 instances. The company does not want new requests to be forwarded to the EC2 instances that are already overloaded. Which solution will meet these requirements?

A. Use the round robin routing algorithm based on the RequestCountPerTarget and ActiveConnectionCount CloudWatch metrics.
B. Use the least outstanding requests algorithm based on the RequestCountPerTarget and ActiveConnectionCount CloudWatch metrics.
C. Use the round robin routing algorithm based on the RequestCount and TargetResponseTime CloudWatch metrics.
D. Use the least outstanding requests algorithm based on the RequestCount and TargetResponseTime CloudWatch metrics.



Question # 25

An analytics company uses Amazon VPC to run its multi-tier services. The company wants to use RESTful APIs to offer a web analytics service to millions of users. Users must be verified by using an authentication service to access the APIs. Which solution will meet these requirements with the MOST operational efficiency?

A. Configure an Amazon Cognito user pool for user authentication. Implement Amazon API Gateway REST APIs with a Cognito authorizer.
B. Configure an Amazon Cognito identity pool for user authentication. Implement Amazon API Gateway HTTP APIs with a Cognito authorizer.
C. Configure an AWS Lambda function to handle user authentication. Implement Amazon API Gateway REST APIs with a Lambda authorizer.
D. Configure an IAM user to handle user authentication. Implement Amazon API Gateway HTTP APIs with an IAM authorizer.



Question # 26

A company has an AWS Direct Connect connection from its on-premises location to an AWS account. The AWS account has 30 different VPCs in the same AWS Region. The VPCs use private virtual interfaces (VIFs). Each VPC has a CIDR block that does not overlap with other networks under the company's control. The company wants to centrally manage the networking architecture while still allowing each VPC to communicate with all other VPCs and on-premises networks. Which solution will meet these requirements with the LEAST amount of operational overhead?

A. Create a transit gateway and associate the Direct Connect connection with a new transit VIF. Turn on the transit gateway's route propagation feature.
B. Create a Direct Connect gateway. Recreate the private VIFs to use the new gateway. Associate each VPC by creating new virtual private gateways.
C. Create a transit VPC. Connect the Direct Connect connection to the transit VPC. Create a peering connection between all other VPCs in the Region. Update the route tables.
D. Create AWS Site-to-Site VPN connections from on premises to each VPC. Ensure that both VPN tunnels are UP for each connection. Turn on the route propagation feature.



Question # 27

A solutions architect is designing a shared storage solution for a web application that is deployed across multiple Availability Zones. The web application runs on Amazon EC2 instances that are in an Auto Scaling group. The company plans to make frequent changes to the content. The solution must have strong consistency in returning the new content as soon as the changes occur. Which solutions meet these requirements? (Select TWO)

A. Use AWS Storage Gateway Volume Gateway Internet Small Computer Systems Interface (iSCSI) block storage that is mounted to the individual EC2 instances.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system on the individual EC2 instances.
C. Create a shared Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the individual EC2 instances.
D. Use AWS DataSync to perform continuous synchronization of data between EC2 hosts in the Auto Scaling group.
E. Create an Amazon S3 bucket to store the web content. Set the metadata for the Cache-Control header to no-cache. Use Amazon CloudFront to deliver the content.



Question # 28

A company needs to extract the names of ingredients from recipe records that are stored as text files in an Amazon S3 bucket. A web application will use the ingredient names to query an Amazon DynamoDB table and determine a nutrition score. The application can handle non-food records and errors. The company does not have any employees who have machine learning knowledge to develop this solution. Which solution will meet these requirements MOST cost-effectively?

A. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon Comprehend. Store the Amazon Comprehend output in the DynamoDB table.
B. Use an Amazon EventBridge rule to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object by using Amazon Forecast to extract the ingredient names. Store the Forecast output in the DynamoDB table.
C. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Use Amazon Polly to create audio recordings of the recipe records. Save the audio files in the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send a URL as a message to employees. Instruct the employees to listen to the audio files and calculate the nutrition score. Store the ingredient names in the DynamoDB table.
D. Use an Amazon EventBridge rule to invoke an AWS Lambda function when a PutObject request occurs. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon SageMaker. Store the inference output from the SageMaker endpoint in the DynamoDB table.
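
A hypothetical Python/boto3 sketch of the Lambda-plus-Comprehend flow in option A: the function is triggered by an S3 object-created event, reads the recipe text, runs Comprehend entity detection, and writes the detected entity text to DynamoDB. The table name and the truncation of the input are illustrative assumptions.

```python
import boto3

s3 = boto3.client("s3")
comprehend = boto3.client("comprehend")
table = boto3.resource("dynamodb").Table("RecipeIngredients")  # placeholder


def lambda_handler(event, context):
    # Triggered by an S3 ObjectCreated event notification.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        text = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # Detect entities in the recipe text (Comprehend limits input size per call).
        entities = comprehend.detect_entities(Text=text[:4500], LanguageCode="en")["Entities"]

        # Store the detected entity names; the application tolerates non-food entries.
        table.put_item(Item={
            "recipe_key": key,
            "ingredients": [e["Text"] for e in entities],
        })
```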



Question # 29

A company has a new mobile app. Anywhere in the world, users can see local news on topics they choose. Users also can post photos and videos from inside the app. Users access content often in the first minutes after the content is posted. New content quickly replaces older content, and then the older content disappears. The local nature of the news means that users consume 90% of the content within the AWS Region where it is uploaded. Which solution will optimize the user experience by providing the LOWEST latency for content uploads?

A. Upload and store content in Amazon S3. Use Amazon CloudFront for the uploads.
B. Upload and store content in Amazon S3. Use S3 Transfer Acceleration for the uploads.
C. Upload content to Amazon EC2 instances in the Region that is closest to the user. Copy the data to Amazon S3.
D. Upload and store content in Amazon S3 in the Region that is closest to the user. Use multiple distributions of Amazon CloudFront.



Question # 30

An ecommerce application uses a PostgreSQL database that runs on an Amazon EC2 instance. During a monthly sales event, database usage increases and causes database connection issues for the application. The traffic is unpredictable for subsequent monthly sales events, which impacts the sales forecast. The company needs to maintain performance when there is an unpredictable increase in traffic. Which solution resolves this issue in the MOST cost-effective way?

A. Migrate the PostgreSQL database to Amazon Aurora Serverless v2.
B. Enable auto scaling for the PostgreSQL database on the EC2 instance to accommodate increased usage.
C. Migrate the PostgreSQL database to Amazon RDS for PostgreSQL with a larger instance type.
D. Migrate the PostgreSQL database to Amazon Redshift to accommodate increased usage.



Question # 31

A company's marketing data is uploaded from multiple sources to an Amazon S3 bucket. A series of data preparation jobs aggregate the data for reporting. The data preparation jobs need to run at regular intervals in parallel. A few jobs need to run in a specific order later. The company wants to remove the operational overhead of job error handling, retry logic, and state management. Which solution will meet these requirements?

A. Use an AWS Lambda function to process the data as soon as the data is uploaded to the S3 bucket. Invoke other Lambda functions at regularly scheduled intervals.
B. Use Amazon Athena to process the data. Use Amazon EventBridge Scheduler to invoke Athena on a regular interval.
C. Use AWS Glue DataBrew to process the data. Use an AWS Step Functions state machine to run the DataBrew data preparation jobs.
D. Use AWS Data Pipeline to process the data. Schedule Data Pipeline to process the data once at midnight.



Question # 32

A research company uses on-premises devices to generate data for analysis. The company wants to use the AWS Cloud to analyze the data. The devices generate .csv files and support writing the data to an SMB file share. Company analysts must be able to use SQL commands to query the data. The analysts will run queries periodically throughout the day. Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)

A. Deploy an AWS Storage Gateway on premises in Amazon S3 File Gateway mode.
B. Deploy an AWS Storage Gateway on premises in Amazon FSx File Gateway mode.
C. Set up an AWS Glue crawler to create a table based on the data that is in Amazon S3.
D. Set up an Amazon EMR cluster with EMR File System (EMRFS) to query the data that is in Amazon S3. Provide access to analysts.
E. Set up an Amazon Redshift cluster to query the data that is in Amazon S3. Provide access to analysts.
F. Set up Amazon Athena to query the data that is in Amazon S3. Provide access to analysts.
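
To illustrate the serverless SQL querying mentioned in option F, here is a small, hypothetical Python/boto3 sketch that runs an Athena query against a Glue-cataloged table and polls for the result. The database, table, query, and output location are placeholders.

```python
import time

import boto3

athena = boto3.client("athena")

# Start a query against a table that a Glue crawler created over the S3 data.
execution_id = athena.start_query_execution(
    QueryString="SELECT device_id, AVG(reading) FROM sensor_data GROUP BY device_id",  # placeholder
    QueryExecutionContext={"Database": "research_db"},                     # placeholder
    ResultConfiguration={"OutputLocation": "s3://athena-results-bucket/"}, # placeholder
)["QueryExecutionId"]

# Poll until the query finishes, then fetch the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=execution_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=execution_id)["ResultSet"]["Rows"]
    print(rows)
```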



Question # 33

A company website hosted on Amazon EC2 instances processes classified data. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The company needs to ensure that all data that is written to the EBS volumes is encrypted at rest. Which solution will meet this requirement?

A. Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.
B. Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.
C. Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.
D. Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.
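
As a hypothetical Python/boto3 sketch of the encrypted-volume approach in option B, the snippet below creates an encrypted gp3 volume and attaches it to an instance; it also shows the account-level EBS encryption-by-default setting as a related safeguard. The Availability Zone, instance ID, and device name are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Create the volume as an encrypted volume (uses the default aws/ebs KMS key
# unless KmsKeyId is supplied).
volume_id = ec2.create_volume(
    AvailabilityZone="us-east-1a",   # placeholder
    Size=100,
    VolumeType="gp3",
    Encrypted=True,
)["VolumeId"]

# Attach the encrypted volume to the web server instance once it is available.
ec2.get_waiter("volume_available").wait(VolumeIds=[volume_id])
ec2.attach_volume(
    VolumeId=volume_id,
    InstanceId="i-0123456789abcdef0",  # placeholder
    Device="/dev/sdf",                 # placeholder
)

# Optionally force every new EBS volume in the Region to be encrypted.
ec2.enable_ebs_encryption_by_default()
```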



Question # 34

A company has Amazon EC2 instances that run nightly batch jobs to process data. The EC2 instances run in an Auto Scaling group that uses On-Demand billing. If a job fails on one instance, another instance will reprocess the job. The batch jobs run between 12:00 AM and 06:00 AM local time every day. Which solution will provide EC2 instances to meet these requirements MOST cost-effectively?

A. Purchase a 1-year Savings Plan for Amazon EC2 that covers the instance family of the Auto Scaling group that the batch job uses.
B. Purchase a 1-year Reserved Instance for the specific instance type and operating system of the instances in the Auto Scaling group that the batch job uses.
C. Create a new launch template for the Auto Scaling group. Set the instances to Spot Instances. Set a policy to scale out based on CPU usage.
D. Create a new launch template for the Auto Scaling group. Increase the instance size. Set a policy to scale out based on CPU usage.



Question # 35

A company hosts a three-tier web application in the AWS Cloud. A Multi-AZ Amazon RDS for MySQL server forms the database layer. Amazon ElastiCache forms the cache layer. The company wants a caching strategy that adds or updates data in the cache when a customer adds an item to the database. The data in the cache must always match the data in the database. Which solution will meet these requirements?

A. Implement the lazy loading caching strategy.
B. Implement the write-through caching strategy.
C. Implement the adding TTL caching strategy.
D. Implement the AWS AppConfig caching strategy.



Question # 36

A company wants to analyze and troubleshoot Access Denied errors and Unauthorized errors that are related to IAM permissions. The company has AWS CloudTrail turned on. Which solution will meet these requirements with the LEAST effort?

A. Use AWS Glue and write custom scripts to query CloudTrail logs for the errors.
B. Use AWS Batch and write custom scripts to query CloudTrail logs for the errors.
C. Search CloudTrail logs with Amazon Athena queries to identify the errors.
D. Search CloudTrail logs with Amazon QuickSight. Create a dashboard to identify the errors.



Question # 37

A global company runs its applications in multiple AWS accounts in AWS Organizations. The company's applications use multipart uploads to upload data to multiple Amazon S3 buckets across AWS Regions. The company wants to report on incomplete multipart uploads for cost compliance purposes. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure AWS Config with a rule to report the incomplete multipart upload object count.
B. Create a service control policy (SCP) to report the incomplete multipart upload object count.
C. Configure S3 Storage Lens to report the incomplete multipart upload object count.
D. Create an S3 Multi-Region Access Point to report the incomplete multipart upload object count.



Question # 38

A company has stored 10 TB of log files in Apache Parquet format in an Amazon S3 bucket. The company occasionally needs to use SQL to analyze the log files. Which solution will meet these requirements MOST cost-effectively?

A. Create an Amazon Aurora MySQL database. Migrate the data from the S3 bucket into Aurora by using AWS Database Migration Service (AWS DMS). Issue SQL statements to the Aurora database.
B. Create an Amazon Redshift cluster. Use Redshift Spectrum to run SQL statements directly on the data in the S3 bucket.
C. Create an AWS Glue crawler to store and retrieve table metadata from the S3 bucket. Use Amazon Athena to run SQL statements directly on the data in the S3 bucket.
D. Create an Amazon EMR cluster. Use Apache Spark SQL to run SQL statements directly on the data in the S3 bucket.



Question # 39

A pharmaceutical company is developing a new drug. The volume of data that the company generates has grown exponentially over the past few months. The company's researchers regularly require a subset of the entire dataset to be immediately available with minimal lag. However, the entire dataset does not need to be accessed on a daily basis. All the data currently resides in on-premises storage arrays, and the company wants to reduce ongoing capital expenses. Which storage solution should a solutions architect recommend to meet these requirements?

A. Run AWS DataSync as a scheduled cron job to migrate the data to an Amazon S3 bucket on an ongoing basis.
B. Deploy an AWS Storage Gateway file gateway with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance.
C. Deploy an AWS Storage Gateway volume gateway with cached volumes with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance.
D. Configure an AWS Site-to-Site VPN connection from the on-premises environment to AWS. Migrate data to an Amazon Elastic File System (Amazon EFS) file system.



Question # 40

A company runs a three-tier web application in a VPC across multiple Availability Zones. Amazon EC2 instances run in an Auto Scaling group for the application tier. The company needs to make an automated scaling plan that will analyze each resource's daily and weekly historical workload trends. The configuration must scale resources appropriately according to both the forecast and live changes in utilization. Which scaling strategy should a solutions architect recommend to meet these requirements?

A. Implement dynamic scaling with step scaling based on average CPU utilization from the EC2 instances.
B. Enable predictive scaling to forecast and scale. Configure dynamic scaling with target tracking.
C. Create an automated scheduled scaling action based on the traffic patterns of the web application.
D. Set up a simple scaling policy. Increase the cooldown period based on the EC2 instance startup time.



Question # 41

A company deployed a serverless application that uses Amazon DynamoDB as a database layer. The application has experienced a large increase in users. The company wants to improve database response time from milliseconds to microseconds and to cache requests to the database. Which solution will meet these requirements with the LEAST operational overhead?

A. Use DynamoDB Accelerator (DAX).
B. Migrate the database to Amazon Redshift.
C. Migrate the database to Amazon RDS.
D. Use Amazon ElastiCache for Redis.



Question # 42

An online video game company must maintain ultra-low latency for its game servers. The game servers run on Amazon EC2 instances. The company needs a solution that can handle millions of UDP internet traffic requests each second. Which solution will meet these requirements MOST cost-effectively?

A. Configure an Application Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.
B. Configure a Gateway Load Balancer for the internet traffic. Specify the EC2 instances as the targets.
C. Configure a Network Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.
D. Launch an identical set of game servers on EC2 instances in separate AWS Regions. Route internet traffic to both sets of EC2 instances.



Question # 43

A company maintains an Amazon RDS database that maps users to cost centers. The company has accounts in an organization in AWS Organizations. The company needs a solution that will tag all resources that are created in a specific AWS account in the organization. The solution must tag each resource with the cost center ID of the user who created the resource. Which solution will meet these requirements?

A. Move the specific AWS account to a new organizational unit (OU) in Organizations from the management account. Create a service control policy (SCP) that requires all existing resources to have the correct cost center tag before the resources are created. Apply the SCP to the new OU.
B. Create an AWS Lambda function to tag the resources after the Lambda function looks up the appropriate cost center from the RDS database. Configure an Amazon EventBridge rule that reacts to AWS CloudTrail events to invoke the Lambda function.
C. Create an AWS CloudFormation stack to deploy an AWS Lambda function. Configure the Lambda function to look up the appropriate cost center from the RDS database and to tag resources. Create an Amazon EventBridge scheduled rule to invoke the CloudFormation stack.
D. Create an AWS Lambda function to tag the resources with a default value. Configure an Amazon EventBridge rule that reacts to AWS CloudTrail events to invoke the Lambda function when a resource is missing the cost center tag.
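
A hypothetical Python/boto3 sketch of the event-driven tagging flow in option B: an EventBridge rule matches CloudTrail resource-creation events and invokes this Lambda handler, which maps the calling user to a cost center and applies the tag with the Resource Groups Tagging API. The cost-center lookup is stubbed out, and the assumption that the event's responseElements carries a resource ARN is illustrative; real CloudTrail events differ per service.

```python
import boto3

tagging = boto3.client("resourcegroupstaggingapi")


def lookup_cost_center(user_arn: str) -> str:
    """Stub: in the described solution this would query the RDS mapping table."""
    return "CC-1234"  # placeholder


def lambda_handler(event, context):
    detail = event["detail"]  # CloudTrail event delivered via EventBridge
    user_arn = detail["userIdentity"]["arn"]
    cost_center = lookup_cost_center(user_arn)

    # Illustrative assumption: derive the created resource's ARN from the event;
    # in practice each service reports created resources differently.
    resource_arn = detail.get("responseElements", {}).get("resourceArn")
    if resource_arn:
        tagging.tag_resources(
            ResourceARNList=[resource_arn],
            Tags={"cost-center": cost_center},
        )
```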



Question # 44

A company is designing a tightly coupled high performance computing (HPC) environment in the AWS Cloud. The company needs to include features that will optimize the HPC environment for networking and storage. Which combination of solutions will meet these requirements? (Select TWO)

A. Create an accelerator in AWS Global Accelerator. Configure custom routing for the accelerator.
B. Create an Amazon FSx for Lustre file system. Configure the file system with scratch storage.
C. Create an Amazon CloudFront distribution. Configure the viewer protocol policy to be HTTP and HTTPS.
D. Launch Amazon EC2 instances. Attach an Elastic Fabric Adapter (EFA) to the instances.
E. Create an AWS Elastic Beanstalk deployment to manage the environment.



Question # 45

A company is running a photo hosting service in the us-east-1 Region. The service enables users across multiple countries to upload and view photos. Some photos are heavily viewed for months, and others are viewed for less than a week. The application allows uploads of up to 20 MB for each photo. The service uses the photo metadata to determine which photos to display to each user. Which solution provides the appropriate user access MOST cost-effectively?

A. Store the photos in Amazon DynamoDB. Turn on DynamoDB Accelerator (DAX) to cache frequently viewed items.
B. Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and its S3 location in DynamoDB.
C. Store the photos in the Amazon S3 Standard storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Use the object tags to keep track of metadata.
D. Store the photos in the Amazon S3 Glacier storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Glacier Deep Archive storage class. Store the photo metadata and its S3 location in Amazon OpenSearch Service.



Question # 46

A company is designing a new web application that will run on Amazon EC2 instances. The application will use Amazon DynamoDB for backend data storage. The application traffic will be unpredictable. The company expects that the application read and write throughput to the database will be moderate to high. The company needs to scale in response to application traffic. Which DynamoDB table configuration will meet these requirements MOST cost-effectively?

A. Configure DynamoDB with provisioned read and write by using the DynamoDB Standard table class. Set DynamoDB auto scaling to a maximum defined capacity.
B. Configure DynamoDB in on-demand mode by using the DynamoDB Standard table class.
C. Configure DynamoDB with provisioned read and write by using the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class. Set DynamoDB auto scaling to a maximum defined capacity.
D. Configure DynamoDB in on-demand mode by using the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class.
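
To illustrate the on-demand capacity mode referenced in options B and D, here is a minimal, hypothetical Python/boto3 sketch that creates a table with PAY_PER_REQUEST billing, so no read or write capacity has to be provisioned or auto-scaled. The table name and key schema are placeholders.

```python
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.create_table(
    TableName="AppData",  # placeholder
    AttributeDefinitions=[
        {"AttributeName": "pk", "AttributeType": "S"},
        {"AttributeName": "sk", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "pk", "KeyType": "HASH"},
        {"AttributeName": "sk", "KeyType": "RANGE"},
    ],
    # On-demand mode: DynamoDB scales with traffic and bills per request.
    BillingMode="PAY_PER_REQUEST",
)
```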



Question # 47

A company's web application that is hosted in the AWS Cloud recently increased in popularity. The web application currently exists on a single Amazon EC2 instance in a single public subnet. The web application has not been able to meet the demand of the increased web traffic. The company needs a solution that will provide high availability and scalability to meet the increased user demand without rewriting the web application. Which combination of steps will meet these requirements? (Select TWO.)

A. Replace the EC2 instance with a larger compute optimized instance.
B. Configure Amazon EC2 Auto Scaling with multiple Availability Zones in private subnets.
C. Configure a NAT gateway in a public subnet to handle web requests.
D. Replace the EC2 instance with a larger memory optimized instance.
E. Configure an Application Load Balancer in a public subnet to distribute web traffic.



Question # 48

A company is designing a web application on AWS. The application will use a VPN connection between the company's existing data centers and the company's VPCs. The company uses Amazon Route 53 as its DNS service. The application must use private DNS records to communicate with the on-premises services from a VPC. Which solution will meet these requirements in the MOST secure manner?

A. Create a Route 53 Resolver outbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC.
B. Create a Route 53 Resolver inbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC.
C. Create a Route 53 private hosted zone. Associate the private hosted zone with the VPC.
D. Create a Route 53 public hosted zone. Create a record for each service to allow service communication.



