Browse SAP Questions

Study all 100 questions at your own pace with detailed explanations

Total: 100 questions | Page 1 of 10
Question 1 of 100

A company has two batch processing applications that consume financial data about the day's stock transactions. Each transaction needs to be stored durably, and delivery of a record to each application must be guaranteed so that the audit and billing batch processing applications can process the data. However, the two applications run separately, several hours apart, and need access to the same transaction information. After the day's transaction information has been reviewed, it no longer needs to be stored. What is the best way to architect this application? Choose the correct answer from the options below.

A. Use SQS for storing the transaction messages. When the billing batch process consumes each message, have the application create an identical message and place it in a different SQS queue for the audit application to use several hours later.
B. Use SQS for storing the transaction messages. When the billing batch process runs first and consumes each message, write the code so that it does not remove the message after it is consumed, leaving it available for the audit application several hours later. The audit application can consume the SQS message and remove it from the queue when completed.
C. Store the transaction information in a DynamoDB table. The billing application can read the rows, while the audit application will read the rows and then remove the data.
D. Use Kinesis to store the transaction information. The billing application will consume data from the stream, and the audit application can consume the same data several hours later.
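As a study aid, here is a minimal Python (boto3) sketch of how a Kinesis stream lets two applications read the same records independently within the retention window, which is the behavior option D relies on. The stream name and record payload are hypothetical.

```python
import boto3

kinesis = boto3.client("kinesis")
STREAM = "stock-transactions"  # hypothetical stream name

# Producer: write each transaction record durably to the stream.
kinesis.put_record(
    StreamName=STREAM,
    Data=b'{"ticker": "ACME", "qty": 100, "price": 42.5}',
    PartitionKey="ACME",
)

def read_all(consumer_name):
    """Each consumer keeps its own iterator, so the billing and audit
    applications can read the same records hours apart (within retention)."""
    shards = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"]
    for shard in shards:
        iterator = kinesis.get_shard_iterator(
            StreamName=STREAM,
            ShardId=shard["ShardId"],
            ShardIteratorType="TRIM_HORIZON",
        )["ShardIterator"]
        for record in kinesis.get_records(ShardIterator=iterator)["Records"]:
            print(consumer_name, record["Data"])

read_all("billing")  # runs first
read_all("audit")    # runs several hours later and sees the same data
```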
Question 2 of 100

Your company is migrating infrastructure to AWS. A large number of developers and administrators will need to control this infrastructure using the AWS Management Console. The Identity Management team is objecting to creating an entirely new directory of IAM users for all employees, and the employees are reluctant to commit yet another password to memory. Which of the following will satisfy both these stakeholders?

A. Users log in to the AWS Management Console using the AWS Command Line Interface.
B. Users request a SAML assertion from your on-premises SAML 2.0-compliant identity provider (IdP) and use that assertion to obtain federated access to the AWS Management Console via the AWS single sign-on (SSO) endpoint.
C. Users sign in using an OpenID Connect (OIDC)-compatible IdP, receive an authentication token, and then use that token to log in to the AWS Management Console.
D. Users log in directly to the AWS Management Console using the credentials from your on-premises Kerberos-compliant identity provider.
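For reference, a minimal boto3 sketch of the STS call behind the SAML-based federation flow described in option B. The role ARN, provider ARN, and assertion value are placeholders.

```python
import boto3

sts = boto3.client("sts")

# The IdP returns a base64-encoded SAML assertion after the user authenticates
# with their existing corporate credentials (placeholder value here).
saml_assertion = "<base64-encoded SAML assertion from the IdP>"

creds = sts.assume_role_with_saml(
    RoleArn="arn:aws:iam::123456789012:role/ConsoleFederation",      # placeholder
    PrincipalArn="arn:aws:iam::123456789012:saml-provider/CorpIdP",  # placeholder
    SAMLAssertion=saml_assertion,
)["Credentials"]

# These temporary credentials can then be exchanged for a console sign-in URL
# via the AWS federation endpoint; no new IAM user or extra password is needed.
print(creds["AccessKeyId"], creds["Expiration"])
```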
Question 3 of 100

The Marketing Director in your company asked you to create a mobile app that lets users post sightings of good deeds known as random acts of kindness in 80-character summaries. You decided to write the application in JavaScript so that it would run on the broadest range of phones, browsers, and tablets. Your application should provide access to Amazon DynamoDB to store the good deed summaries. Initial testing of a prototype shows that there aren’t large spikes in usage. Which option provides the most cost-effective and scalable architecture for this application?

A. Provide the JavaScript client with temporary credentials from the Security Token Service using a Token Vending Machine (TVM) on an EC2 instance to provide signed credentials mapped to an AWS Identity and Access Management (IAM) user allowing DynamoDB puts and S3 gets. You serve your mobile application out of an S3 bucket enabled as a website. Your client updates DynamoDB.
B. Register the application with a Web Identity Provider like Amazon, Google, or Facebook, create an IAM role for that provider, and set up permissions for the IAM role to allow S3 gets and DynamoDB puts. You serve your mobile application out of an S3 bucket enabled as a website. Your client updates DynamoDB.
C. Provide the JavaScript client with temporary credentials from the Security Token Service using a Token Vending Machine (TVM) to provide signed credentials mapped to an IAM user allowing DynamoDB puts. You serve your mobile application out of Apache EC2 instances that are load-balanced and auto-scaled. Your EC2 instances are configured with an IAM role that allows DynamoDB puts. Your server updates DynamoDB.
D. Register the JavaScript application with a Web Identity Provider like Amazon, Google, or Facebook, create an IAM role for that provider, and set up permissions for the IAM role to allow DynamoDB puts. You serve your mobile application out of Apache EC2 instances that are load-balanced and auto-scaled. Your EC2 instances are configured with an IAM role that allows DynamoDB puts. Your server updates DynamoDB.
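Although the question's app is written in JavaScript, here is an illustrative Python (boto3) sketch of the web identity federation flow that option B describes: exchange a provider token for temporary credentials, then write to DynamoDB. The role ARN, table name, and token are hypothetical.

```python
import boto3

sts = boto3.client("sts")

# Token returned by the web identity provider (Login with Amazon, Google,
# Facebook) after the user signs in -- placeholder value here.
id_token = "<OAuth/OIDC token from the identity provider>"

creds = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::123456789012:role/MobileAppDynamoDBPut",  # placeholder
    RoleSessionName="good-deeds-app",
    WebIdentityToken=id_token,
)["Credentials"]

# The temporary credentials are scoped by the role's policy (e.g. only
# dynamodb:PutItem on the summaries table) -- no TVM or server fleet needed.
dynamodb = boto3.client(
    "dynamodb",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
dynamodb.put_item(
    TableName="GoodDeeds",  # hypothetical table
    Item={"id": {"S": "deed-001"}, "summary": {"S": "Helped carry groceries"}},
)
```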
Question 4 of 100

You are designing the network infrastructure for an application server in Amazon VPC. Users will access all the application instances from the Internet as well as from an on-premises network. The on-premises network is connected to your VPC over an AWS Direct Connect link. How would you design routing to meet the above requirements?

A. Configure a single routing table with a default route via the internet gateway. Propagate a default route via BGP on the AWS Direct Connect customer router. Associate the routing table with all VPC subnets.
B. Configure a single routing table with a default route via the internet gateway. Propagate specific routes for the on-premises networks via BGP on the AWS Direct Connect customer router. Associate the routing table with all VPC subnets.
C. Configure a single routing table with two default routes: one to the internet via an internet gateway, the other to the on-premises network via the VPN gateway. Use this routing table across all subnets in your VPC.
D. Configure two routing tables: one that has a default route via the internet gateway and another that has a default route via the VPN gateway. Associate both routing tables with each VPC subnet.
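A minimal boto3 sketch of the kind of single-routing-table setup options A and B describe: a default route to the internet gateway, BGP route propagation from the gateway attached to the Direct Connect link, and association with every subnet. All IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

ROUTE_TABLE_ID = "rtb-0123456789abcdef0"  # placeholder
IGW_ID = "igw-0123456789abcdef0"          # placeholder internet gateway
VGW_ID = "vgw-0123456789abcdef0"          # placeholder virtual private gateway
                                          # attached to the Direct Connect link

# Default route to the internet via the internet gateway.
ec2.create_route(
    RouteTableId=ROUTE_TABLE_ID,
    DestinationCidrBlock="0.0.0.0/0",
    GatewayId=IGW_ID,
)

# Let BGP-advertised on-premises prefixes propagate into the routing table
# (the customer router advertises specific routes, not a default).
ec2.enable_vgw_route_propagation(RouteTableId=ROUTE_TABLE_ID, GatewayId=VGW_ID)

# Associate the single routing table with each subnet in the VPC.
for subnet_id in ["subnet-aaaa1111", "subnet-bbbb2222"]:  # placeholders
    ec2.associate_route_table(RouteTableId=ROUTE_TABLE_ID, SubnetId=subnet_id)
```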
Question 5 of 100

You are designing a multi-region architecture and want to send users to a geographic location using latency-based routing, which seems simple enough; however, you also want to use weighted routing among the resources within that region. Which of the setups below would best accomplish this? Choose the correct answer from the options below.

A. You will need to use complex routing (nested record sets) and ensure that you define the latency-based records first.
B. You will need to use complex routing (nested record sets) and ensure that you define the weighted resource record sets first.
C. You will need to use AAAA (IPv6) records when you define the weighted record sets.
D. This cannot be done. You can't use different routing policies together.
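To make the "nested record sets" idea in options A and B concrete, here is an illustrative boto3 sketch that creates weighted records for resources inside a region and a latency-based record that points at them. The hosted zone, record names, and ELB targets are hypothetical.

```python
import boto3

route53 = boto3.client("route53")
ZONE_ID = "Z123EXAMPLE"  # placeholder hosted zone ID

route53.change_resource_record_sets(
    HostedZoneId=ZONE_ID,
    ChangeBatch={
        "Changes": [
            # Inner, region-level weighted records splitting traffic between
            # two ELBs in us-east-1 (values are placeholders).
            {"Action": "UPSERT", "ResourceRecordSet": {
                "Name": "us-east-1.app.example.com", "Type": "CNAME", "TTL": 60,
                "SetIdentifier": "use1-blue", "Weight": 70,
                "ResourceRecords": [{"Value": "blue-elb.us-east-1.elb.amazonaws.com"}]}},
            {"Action": "UPSERT", "ResourceRecordSet": {
                "Name": "us-east-1.app.example.com", "Type": "CNAME", "TTL": 60,
                "SetIdentifier": "use1-green", "Weight": 30,
                "ResourceRecords": [{"Value": "green-elb.us-east-1.elb.amazonaws.com"}]}},
            # Outer, latency-based record pointing at the regional name above,
            # so latency routing picks the region and weights split within it.
            {"Action": "UPSERT", "ResourceRecordSet": {
                "Name": "app.example.com", "Type": "CNAME", "TTL": 60,
                "SetIdentifier": "use1-latency", "Region": "us-east-1",
                "ResourceRecords": [{"Value": "us-east-1.app.example.com"}]}},
        ]
    },
)
```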
Question 6 of 100

Your company produces customer-commissioned, one-of-a-kind skiing helmets combining high fashion with custom technical enhancements. Customers can show off their individuality on the ski slopes and have access to heads-up displays, GPS, rear-view cams, and any other technical innovation they wish to embed in the helmet. The current manufacturing process is data rich and complex, including assessments to ensure that the custom electronics and materials used to assemble the helmets are of the highest standards. Assessments are a mixture of human and automated assessments. You need to add a new set of assessments to model the failure modes of the custom electronics using GPUs with CUDA across a cluster of servers with low-latency networking. What architecture would allow you to automate the existing process using a hybrid approach and ensure that the architecture can support the evolution of processes over time?

A. Use AWS Data Pipeline to manage movement of data and metadata and assessments. Use an Auto Scaling group of G2 instances in a placement group.
B. Use Amazon Simple Workflow (SWF) to manage assessments and movement of data and metadata. Use an Auto Scaling group of G2 instances in a placement group.
C. Use Amazon Simple Workflow (SWF) to manage assessments and movement of data and metadata. Use an Auto Scaling group of C3 instances with SR-IOV (Single Root I/O Virtualization).
D. Use AWS Data Pipeline to manage movement of data and metadata and assessments. Use an Auto Scaling group of C3 instances with SR-IOV (Single Root I/O Virtualization).
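For study purposes, a minimal boto3 sketch of the "Auto Scaling group of G2 instances in a placement group" portion that appears in options A and B; a cluster placement group provides the low-latency networking the CUDA workload needs. The AMI, names, and sizes are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")
autoscaling = boto3.client("autoscaling")

# Cluster placement group for low-latency networking between GPU workers.
ec2.create_placement_group(GroupName="cuda-assessments", Strategy="cluster")

autoscaling.create_launch_configuration(
    LaunchConfigurationName="cuda-workers",
    ImageId="ami-0123456789abcdef0",  # placeholder GPU-enabled AMI
    InstanceType="g2.2xlarge",        # GPU family named in options A and B
)

autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="cuda-assessment-asg",
    LaunchConfigurationName="cuda-workers",
    MinSize=2,
    MaxSize=8,
    PlacementGroup="cuda-assessments",
    AvailabilityZones=["us-east-1a"],  # cluster placement groups are single-AZ
)
```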
Question 7 of 100

You are designing an intrusion detection and prevention (IDS/IPS) solution for a customer's web application in a single VPC. You are considering the options for implementing IDS/IPS protection for traffic coming from the Internet. Which of the following options would you consider? (Choose 2 answers)

A. Implement IDS/IPS agents on each instance running in the VPC.
B. Configure an instance in each subnet to switch its network interface card to promiscuous mode and analyze network traffic.
C. Implement Elastic Load Balancing with SSL listeners in front of the web applications.
D. Implement a reverse proxy layer in front of the web servers and configure IDS/IPS agents on each reverse proxy server.
Question 8 of 100

Your customer wants to consolidate their log streams (access logs, application logs, security logs, etc.) into one single system. Once consolidated, the customer wants to analyze these logs in real time based on heuristics. From time to time, the customer needs to validate the heuristics, which requires going back to data samples extracted from the last 12 hours. What is the best approach to meet your customer's requirements?

A. Send all the log events to Amazon SQS. Set up an Auto Scaling group of EC2 servers to consume the logs and apply the heuristics.
B. Send all the log events to Amazon Kinesis. Develop a client process to apply heuristics on the logs.
C. Configure Amazon CloudTrail to receive custom logs, and use EMR to apply heuristics to the logs.
D. Set up an Auto Scaling group of EC2 syslogd servers, store the logs on S3, and use EMR to apply heuristics to the logs.
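A minimal boto3 sketch showing how a Kinesis consumer can go back to samples from the last 12 hours using an AT_TIMESTAMP shard iterator, which is the replay capability the scenario asks about. The stream name is hypothetical.

```python
import boto3
from datetime import datetime, timedelta, timezone

kinesis = boto3.client("kinesis")
STREAM = "consolidated-logs"  # hypothetical stream name

# Real-time consumers read from LATEST; to validate heuristics, a separate
# consumer can replay samples from the last 12 hours (within retention).
twelve_hours_ago = datetime.now(timezone.utc) - timedelta(hours=12)

shards = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"]
for shard in shards:
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM,
        ShardId=shard["ShardId"],
        ShardIteratorType="AT_TIMESTAMP",
        Timestamp=twelve_hours_ago,
    )["ShardIterator"]
    batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in batch["Records"]:
        print(record["ApproximateArrivalTimestamp"], record["Data"])
```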
Question 9 of 100 (Multiple Choice)

A company has several departments, and its AWS specialist created an AWS organization in the master account, which is owned by the operations team. He invited other departments such as Development, QA, and HR to join the organization. After all the invitations were accepted, the payer account and linked accounts were set up successfully. Which benefits can this consolidated billing configuration bring to the organization? (Select TWO)

A. It becomes more secure, as only the payer account can see the total usage and charges across all the accounts. Owners of the linked accounts cannot see their own usage and charges.
B. The usage across all accounts can share the volume pricing discounts and Reserved Instance discounts.
C. The consolidated billing feature does not bring additional cost.
D. AWS Organizations automatically creates a root user and an IAM role for all linked accounts so that the master account can access and administer the member accounts.
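For context, a minimal boto3 sketch of the setup described in the question stem: creating an organization for consolidated billing from the payer account and inviting the department accounts. The account IDs are placeholders.

```python
import boto3

org = boto3.client("organizations")

# Create the organization from the payer (master/management) account.
org.create_organization(FeatureSet="CONSOLIDATED_BILLING")

# Invite each department's account (account IDs are placeholders).
for account_id in ["111111111111", "222222222222", "333333333333"]:
    org.invite_account_to_organization(
        Target={"Id": account_id, "Type": "ACCOUNT"},
        Notes="Join the organization for consolidated billing",
    )
```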
Question 10 of 100

You are looking to migrate your Development (Dev) and Test environments to AWS. You have decided to use separate AWS accounts to host each environment. You plan to link each account's bill to a Master AWS account using Consolidated Billing. To make sure you keep within budget, you would like to implement a way for administrators in the Master account to have access to stop, delete, and/or terminate resources in both the Dev and Test accounts. Identify which option will allow you to achieve this goal.

A. Create IAM users in the Master account with full Admin permissions. Create cross-account roles in the Dev and Test accounts that grant the Master account access to the resources in the account by inheriting permissions from the Master account.
B. Create IAM users and a cross-account role in the Master account that grants full Admin permissions to the Dev and Test accounts.
C. Create IAM users in the Master account. Create cross-account roles in the Dev and Test accounts that have full Admin permissions and grant the Master account access.
D. Link the accounts using Consolidated Billing. This will give IAM users in the Master account access to resources in the Dev and Test accounts.
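A minimal boto3 sketch of how an administrator in the Master account would use a cross-account role created in the Dev account, the pattern options A and C describe: assume the role, then act on resources in that account. The role ARN and instance ID are placeholders.

```python
import boto3

sts = boto3.client("sts")

# An administrator signed in to the Master account assumes a role that was
# created in the Dev account and trusts the Master account (placeholder ARN).
creds = sts.assume_role(
    RoleArn="arn:aws:iam::111111111111:role/MasterAdminAccess",
    RoleSessionName="master-admin-cleanup",
)["Credentials"]

# EC2 calls now run in the Dev account, scoped by that role's policy
# (e.g. allowing ec2:StopInstances / ec2:TerminateInstances).
ec2_dev = boto3.client(
    "ec2",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
ec2_dev.stop_instances(InstanceIds=["i-0123456789abcdef0"])  # placeholder
```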
Showing 1-10 of 100 questions