Browse SAP Questions
Study all 100 questions at your own pace with detailed explanations
Total: 100 questions | Page: 2 of 10
Question 11 of 100 (Multiple Choice)
To meet new security compliance requirements, your company has hired an external auditor to assess the security perimeter around your SaaS platform. The application runs in multiple regions and uses a load balancer within each region for higher availability. The instances load sensitive configuration from an S3 bucket at startup, and DynamoDB is used as the primary database. The auditor has advised further tightening the security groups and NACLs based on the application's requirements, and accessing AWS services over the private network instead of the public endpoints. Your team decided to use VPC Endpoints, since they route all communication over the AWS internal network; after a detailed examination, they realised the current architecture cannot use VPC endpoints as-is and will require a set of modifications. What modifications would be needed to align the architecture? (Select THREE)
A. Configure DynamoDB Global Tables to replicate the data into multiple regions.
B. Create VPC Endpoints for S3 and DynamoDB and modify the route tables for all the Availability Zones used by the Auto Scaling group.
C. Use a NAT Gateway for all egress communication to these AWS services.
D. Set up a VPC gateway endpoint for S3 and an interface endpoint for DynamoDB to communicate with these services over the private AWS network.
E. Use S3 Cross-Region Replication to save the configurations in multiple regions.
💡 Try to answer first, then click "Show Answer" to see the correct answer and explanation
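For reference, a gateway endpoint is attached to route tables rather than subnets, which is why every route table used by the Auto Scaling group's Availability Zones must be updated. Below is a minimal sketch of the request shape and an optional endpoint policy; every identifier (VPC id, route table ids, bucket name) is assumed for illustration only:

```python
import json

# Hypothetical bucket name; a real deployment would use its own.
CONFIG_BUCKET = "example-config-bucket"

# A gateway endpoint can carry a policy restricting which resources are
# reachable over the private route, e.g. only the configuration bucket.
endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": ["s3:GetObject"],
        "Resource": [f"arn:aws:s3:::{CONFIG_BUCKET}/*"],
    }],
}

# Shape of a create-VPC-endpoint request (not executed here): a gateway
# endpoint attaches to the route tables of every AZ the ASG spans.
create_endpoint_request = {
    "VpcId": "vpc-0123456789abcdef0",            # assumed VPC id
    "ServiceName": "com.amazonaws.us-east-1.s3",
    "VpcEndpointType": "Gateway",
    "RouteTableIds": ["rtb-aaaa1111", "rtb-bbbb2222"],  # one per AZ, assumed
    "PolicyDocument": json.dumps(endpoint_policy),
}
print(create_endpoint_request["ServiceName"])
```

The same request shape, with a different `ServiceName`, covers DynamoDB; no NAT Gateway or public route is involved.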
Question 12 of 100
You're building a mobile game application. The application needs permissions for each user to communicate with and store data in DynamoDB tables. What is the best method of granting each mobile device that installs your application access to DynamoDB tables for storage when required? Choose the correct answer from the options below.
A. During the install and game configuration process, have each user create an IAM credential and assign the IAM user to a group with proper permissions to communicate with DynamoDB.
B. Create an IAM group that only gives access to your application and to the DynamoDB tables. Then, when writing to DynamoDB, simply include the unique device ID to associate the data with that specific user.
C. Create an IAM role with the proper permission policy to communicate with the DynamoDB table. Use web identity federation, which assumes the IAM role using AssumeRoleWithWebIdentity when the user signs in, granting temporary security credentials using STS.
D. Create an Active Directory server and an AD user for each mobile application user. When the user signs in to the AD sign-on, allow the AD server to federate using SAML 2.0 to IAM and assign a role to the AD user, which is then assumed with AssumeRoleWithSAML.
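The web identity federation flow described here hinges on a role whose trust policy allows `sts:AssumeRoleWithWebIdentity` from the identity provider. A minimal sketch of such a trust policy, assuming Login with Amazon as the provider; the app id is a placeholder:

```python
import json

# Trust policy a web-identity role would carry; provider and app id are
# illustrative assumptions, not values from the question.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Federated": "www.amazon.com"},
        "Action": "sts:AssumeRoleWithWebIdentity",
        "Condition": {
            "StringEquals": {"www.amazon.com:app_id": "example-app-id"}
        },
    }],
}
document = json.dumps(trust_policy)
print(document)
```

At sign-in, the device exchanges the provider's token for temporary STS credentials scoped by this role, so no long-lived IAM user keys ever ship with the app.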
Question 13 of 100
You have an application running on an EC2 instance, which will allow users to download files from a private S3 bucket using a pre-signed URL. Before generating the URL, the application should verify the existence of the file in S3. How should the application use AWS credentials to access the S3 bucket securely?
A. Use the AWS account access keys; the application retrieves the credentials from the source code of the application.
B. Create an IAM user for the application with permissions that allow list access to the S3 bucket, launch the instance as the IAM user, and retrieve the IAM user's credentials from the EC2 instance user data.
C. Create an IAM role for EC2 that allows list access to objects in the S3 bucket. Launch the instance with the role, and retrieve the role's credentials from the EC2 instance metadata.
D. Create an IAM user for the application with permissions that allow list access to the S3 bucket. The application retrieves the IAM user credentials from a temporary directory with permissions that allow read access only to the application user.
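The instance-role mechanism works like this: the SDK on the instance fetches short-lived credentials from the instance metadata service and refreshes them automatically. The IMDS path below is the real one; the credential payload is a fabricated sample shaped like what the service returns:

```python
import json

# Real IMDS base path for role credentials (queried from on the instance).
IMDS_BASE = "http://169.254.169.254/latest/meta-data/iam/security-credentials/"

def credential_url(role_name: str) -> str:
    """URL an SDK queries to fetch the role's temporary credentials."""
    return IMDS_BASE + role_name

# Fabricated sample payload, shaped like an IMDS credential response.
sample_payload = json.dumps({
    "AccessKeyId": "ASIAEXAMPLE",
    "SecretAccessKey": "example-secret",
    "Token": "example-session-token",
    "Expiration": "2024-01-01T00:00:00Z",
})

creds = json.loads(sample_payload)
# These rotate automatically; nothing is stored in source, user data, or disk.
print(credential_url("app-s3-list-role"), creds["Expiration"])
```

The role name `app-s3-list-role` is an assumption for illustration; the point is that credentials never appear in code, user data, or files.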
Question 14 of 100
One of your work colleagues has just left, and you have been handed some of the infrastructure he set up. In one of the setups you start looking at, he has created multiple components of a single application, and all the components are hosted on a single EC2 instance (without an ELB) in a VPC. You have been told that this needs to be set up with two separate SSL certificates, one for each component. Which of the following would best achieve setting up the two separate SSL certificates while still using only one EC2 instance? Choose the correct answer:
A. Create an EC2 instance which has multiple network interfaces with multiple elastic IP addresses.
B. Create an EC2 instance which has both an ACL and a security group attached to it, and have separate rules for each IP address.
C. Create an EC2 instance which has multiple subnets attached to it, each with a separate IP address.
D. Create an EC2 instance with a NAT address.
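The multiple-ENI approach comes down to one TLS listener per address, each with its own certificate. A rough Python sketch of that idea using the standard library; the addresses and certificate paths are placeholders and the certificates are not actually loaded:

```python
import ssl

# One TLS context per site; cert paths are placeholders and left commented.
site_a = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
site_b = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
# site_a.load_cert_chain("site-a.pem")  # served on the first ENI's EIP
# site_b.load_cert_chain("site-b.pem")  # served on the second ENI's EIP

# Each context would wrap a socket bound to a different private address,
# one per elastic network interface (addresses assumed for illustration):
bindings = {
    "10.0.0.10": site_a,   # primary ENI's private address
    "10.0.0.20": site_b,   # secondary ENI's private address
}
print(sorted(bindings))
```

Each private address maps to its own Elastic IP, so the two components present distinct certificates from the same instance.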
Question 15 of 100
Your new client has asked you to evaluate their current Disaster Recovery scheme. An employee who left the company was in charge of their disaster recovery. The employee left very little documentation on DR, but you have been assured that there is a DR scheme in place. After viewing their AWS account you find what you believe is the DR configuration in a second region. You compare the resources in this region to the resources in the region that contains the production environment. Some of the things you found include: A minimal version of the RDS database, several Elastic IP addresses, several AMIs with specific names which associate them with your production Web Servers, and a CloudFormation template containing all of the remaining resources (other than the RDS database) in your environment. Which DR strategy is most likely being used here?
A. Backup and restore
B. Pilot Light
C. Multi-Site
D. Warm Standby
Question 16 of 100
A media production company wants to deliver high-definition raw video for preproduction and dubbing to customers all around the world. They would like to use Amazon CloudFront for their scenario, and they require the ability to limit downloads per customer and video file to a configurable number. A CloudFront download distribution with TTL=0 was already set up to make sure all client HTTP requests hit an authentication backend on Amazon Elastic Compute Cloud (EC2)/Amazon RDS first, which is responsible for restricting the number of downloads. Content is stored in S3 and configured to be accessible only via CloudFront. What else needs to be done to achieve an architecture that meets the requirements? Choose 2 answers.
A. Enable URL parameter forwarding, let the authentication backend count the number of downloads per customer in RDS, and return the content S3 URL unless the download limit is reached.
B. Enable CloudFront logging into an S3 bucket, leverage EMR to analyze the CloudFront logs to determine the number of downloads per customer, and return the content S3 URL unless the download limit is reached.
C. Enable URL parameter forwarding, let the authentication backend count the number of downloads per customer in RDS, and invalidate the CloudFront distribution as soon as the download limit is reached.
D. Enable CloudFront logging into the S3 bucket, let the authentication backend determine the number of downloads per customer by parsing those logs, and return the content S3 URL unless the download limit is reached.
E. Configure a list of trusted signers, let the authentication backend count the number of download requests per customer in RDS, and return a dynamically signed URL unless the download limit is reached.
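A dynamically signed CloudFront URL embeds a policy limiting the resource and expiry, signed with a trusted signer's private key. A sketch of the canned-policy construction using only the standard library; the distribution domain and expiry are examples, and the RSA signing step (which requires the signer's key) is omitted:

```python
import base64
import json
import time

# Example resource on an assumed distribution domain; expiry 5 minutes out.
resource = "https://d111111abcdef8.cloudfront.net/video.mp4"
expires = int(time.time()) + 300

# Canned policy: resource plus a DateLessThan condition, compact-encoded.
policy = json.dumps({
    "Statement": [{
        "Resource": resource,
        "Condition": {"DateLessThan": {"AWS:EpochTime": expires}},
    }],
}, separators=(",", ":"))

# CloudFront replaces URL-unsafe base64 characters: '+'→'-', '='→'_', '/'→'~'.
encoded = (base64.b64encode(policy.encode())
           .decode()
           .replace("+", "-").replace("=", "_").replace("/", "~"))
print(encoded[:16])
```

The backend would append this encoded policy, an RSA signature over it, and the signer's key-pair id as query parameters before handing the URL to the client.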
Question 17 of 100
A company is running a MySQL RDS instance in AWS; however, a new requirement for disaster recovery is keeping a read replica of the production RDS instance in an on-premises data center. What is the most secure way of performing this replication? Choose the correct answer from the options below.
A. Configure the RDS instance as the master and enable replication over the open internet using a secure SSL endpoint to the on-premises server.
B. RDS cannot replicate to an on-premises database server. Instead, first configure the RDS instance to replicate to an EC2 instance with core MySQL, and then configure replication over a secure VPN/VGW connection.
C. Create a Data Pipeline that exports the MySQL data each night and securely download the data from an S3 HTTPS endpoint.
D. Create an IPsec VPN connection using either OpenVPN or a VPN/VGW through the Virtual Private Cloud service, and enable replication from AWS RDS to the on-premises database instance.
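Replication to an external replica is configured on the on-premises side by pointing it at the RDS endpoint, which the VPN makes reachable privately. A sketch of the MySQL statement involved; the endpoint, user, and binlog coordinates below are placeholders, not real values:

```python
# Statement an on-premises MySQL replica would run, reaching the RDS
# endpoint over the VPN; all identifiers here are illustrative placeholders.
change_master = (
    "CHANGE MASTER TO "
    "MASTER_HOST='mydb.example.us-east-1.rds.amazonaws.com', "
    "MASTER_USER='repl_user', "
    "MASTER_PASSWORD='<password>', "
    "MASTER_LOG_FILE='mysql-bin-changelog.000042', "
    "MASTER_LOG_POS=120;"
)
print(change_master)
```

With the IPsec tunnel in place the replication stream never touches the open internet, which is what makes this the most secure of the workable options.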
Question 18 of 100
Your company has recently extended its data center into a VPC on AWS to add burst computing capacity as needed. Members of your Network Operations Center need to be able to go to the AWS Management Console and administer Amazon EC2 instances as necessary. You don't want to create new IAM users for each NOC member and make those users sign in again to the AWS Management Console. Which option below will meet the needs of your NOC members?
A. Use OAuth 2.0 to retrieve temporary AWS security credentials to enable your NOC members to sign in to the AWS Management Console.
B. Use web identity federation to retrieve AWS temporary security credentials to enable your NOC members to sign in to the AWS Management Console.
C. Use your on-premises SAML 2.0-compliant identity provider (IdP) to grant the NOC members federated access to the AWS Management Console via the AWS single sign-on (SSO) endpoint.
D. Use your on-premises SAML 2.0-compliant identity provider (IdP) to retrieve temporary security credentials to enable NOC members to sign in to the AWS Management Console.
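The SAML options rest on an IAM role that trusts the on-premises IdP. A sketch of such a trust policy; the account id and provider name are placeholders, while the `SAML:aud` value is the standard console sign-in endpoint:

```python
import json

# Trust policy for a role assumed via SAML federation; the account id and
# saml-provider name are illustrative assumptions.
saml_trust = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            "Federated": "arn:aws:iam::123456789012:saml-provider/CorpIdP"
        },
        "Action": "sts:AssumeRoleWithSAML",
        "Condition": {
            "StringEquals": {"SAML:aud": "https://signin.aws.amazon.com/saml"}
        },
    }],
}
print(json.dumps(saml_trust, indent=2))
```

NOC members authenticate once against the corporate IdP; the IdP's SAML assertion is posted to the sign-in endpoint, which exchanges it for a console session without any per-member IAM users.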
Question 19 of 100
Your company sells consumer devices and needs to record the first activation of all sold devices. Devices are not considered activated until the information is written to a persistent database. Activation data is very important for your company and must be analyzed daily with a MapReduce job. The execution time of the data analysis process must be less than three hours per day. Devices are usually sold evenly during the year, but when a new device model comes out there is a predictable peak in activations; that is, for a few days there are 10 or even 100 times more activations than on an average day. Which of the following databases and analysis frameworks would you implement to best optimize costs and performance for this workload?
A. Amazon RDS and Amazon Elastic MapReduce with Spot instances.
B. Amazon DynamoDB and Amazon Elastic MapReduce with Spot instances.
C. Amazon RDS and Amazon Elastic MapReduce with Reserved instances.
D. Amazon DynamoDB and Amazon Elastic MapReduce with Reserved instances.
Question 20 of 100
A company is hosting an Nginx web application. They want to use EMR to create jobs that sift through all of the web server logs and error logs to pull statistics on clickstream and errors based on client IP address. Given the requirements, what would be the best method for collecting the log data and analyzing it automatically? Choose the correct answer:
A. Configure ELB error logs, then create a Data Pipeline job which imports the logs from an S3 bucket into EMR for analysis and outputs the EMR data into a new S3 bucket.
B. If the application is using HTTP, configure proxy protocol to pass the client IP address in a new HTTP header. If the application is using TCP, modify the application code to pull the client IP into the X-Forwarded-For header so the web servers can parse it.
C. Configure ELB access logs, then create a Data Pipeline job which imports the logs from an S3 bucket into EMR for analysis and outputs the EMR data into a new S3 bucket.
D. If the application is using TCP, configure proxy protocol to pass the client IP address in a new TCP header. If the application is using HTTP, modify the application code to pull the client IP into the X-Forwarded-For header so the web servers can parse it.
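Once ELB access logs are flowing to S3, the client IP is the third space-separated field of each log line, so an EMR step can extract it directly. A small sketch against a fabricated line in the Classic ELB access-log format:

```python
# A fabricated line in the Classic ELB access-log format; real logs land in
# the S3 bucket configured on the load balancer.
sample = ('2024-01-01T00:00:00.123456Z my-elb 203.0.113.7:54321 '
          '10.0.1.5:80 0.000045 0.000120 0.000022 200 200 0 512 '
          '"GET http://example.com:80/ HTTP/1.1" "curl/8.0" - -')

def client_ip(log_line: str) -> str:
    """Third space-separated field is client:port; strip the port."""
    return log_line.split()[2].rsplit(":", 1)[0]

print(client_ip(sample))  # → 203.0.113.7
```

A Data Pipeline job would feed whole log objects from S3 into an EMR cluster running this kind of extraction at scale, writing the aggregated statistics back to a second bucket.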