
Cloud Pak for Integration Essentials Cognitive Class Exam Answers:-

Course Name :- Cloud Pak for Integration Essentials

Question 1: Every organization, regardless of size or complexity, NEEDS and USES APIs for which of the following reasons?

  • Gain access to databases and backend applications for reuse
  • Re-use existing apps
  • Allow front-end and back-end teams to develop applications in parallel
  • All of the above

Question 2: The Gateway Instance of API Connect is responsible for all of the following EXCEPT:

  • API Policy enforcement
  • Self-service App Developer Portal
  • Traffic Control and Mediation
  • Monitoring/Analytics Collection

Question 3: What is the most important reason why you would use API Gateway gateways over Commodity Gateways?

  • API Gateway gateways are simple and secure, delivered as a hardened Docker container, while Commodity Gateways are insecure and hard to manage
  • API Gateway gateways provide convenient integration with the rest of the API Connect platform
  • There is no reason; Commodity Gateways can be easily swapped out with API Gateway gateways on the API Connect platform
  • API Gateway gateways come with additional features not available on most Commodity Gateways

Question 4: Which of the following is not an advantage of managing APIs through an API Lifecycle?

  • Evolve API definitions continuously
  • Align with DevOps practices
  • Manage and Control API Lifecycle and versioning from staging to deprecation to meet corporate governance needs
  • Export APIs as OpenAPI specs for easy import into third-party tools

Question 5: The benefit of defining global policies in API Connect is:

  • Enforce corporate standards and centralize governance with common security and logging policies without impacting API development
  • Flexible security permissions for deploying global policies, giving CSOs and security professionals additional access control and flexibility
  • Accelerated definition, as it follows the same assembly constructs as those within the API
  • All of the above

Question 6: All of the following are easily accomplished when using IBM MQ messaging except:

  • Making messaging available within each environment to assure local access
  • Guaranteed message ordering when using Queue Manager clusters
  • At most once delivery
  • Seamlessly handling temporary network disruptions between environments
  • Providing a reliable communication channel for applications running in different clouds
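
For context on Question 6, here is a minimal sketch of putting a persistent message with the IBM MQ classes for JMS. The host, channel, queue manager, and queue names are placeholder values, not something from the course. Persistent delivery is what lets the queue manager hold a message across temporary network disruptions between environments.

    import javax.jms.DeliveryMode;
    import javax.jms.Destination;
    import javax.jms.JMSContext;
    import javax.jms.JMSException;
    import javax.jms.JMSProducer;
    import com.ibm.msg.client.jms.JmsConnectionFactory;
    import com.ibm.msg.client.jms.JmsFactoryFactory;
    import com.ibm.msg.client.wmq.WMQConstants;

    public class MqReliablePut {
        public static void main(String[] args) throws JMSException {
            JmsFactoryFactory ff = JmsFactoryFactory.getInstance(WMQConstants.WMQ_PROVIDER);
            JmsConnectionFactory cf = ff.createConnectionFactory();
            cf.setStringProperty(WMQConstants.WMQ_HOST_NAME, "mq.example.com");  // placeholder host
            cf.setIntProperty(WMQConstants.WMQ_PORT, 1414);
            cf.setStringProperty(WMQConstants.WMQ_CHANNEL, "DEV.APP.SVRCONN");   // placeholder channel
            cf.setStringProperty(WMQConstants.WMQ_QUEUE_MANAGER, "QM1");         // placeholder queue manager
            cf.setIntProperty(WMQConstants.WMQ_CONNECTION_MODE, WMQConstants.WMQ_CM_CLIENT);

            try (JMSContext ctx = cf.createContext()) {
                Destination queue = ctx.createQueue("queue:///DEV.QUEUE.1");     // placeholder queue
                JMSProducer producer = ctx.createProducer();
                // Persistent messages are stored by the queue manager until consumed,
                // so a temporary network disruption between environments does not lose them.
                producer.setDeliveryMode(DeliveryMode.PERSISTENT);
                producer.send(queue, "order-created");
            }
        }
    }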

Question 7: All of the following are easily accomplished when using Apache Kafka except:

  • Each topic can be replicated onto multiple brokers
  • Time based retention of messages
  • In a consumer group, each partition is consumed by a single member of the group
  • A producer can prioritize speed or reliability by choosing the level of acknowledgement
  • Automatic deletion of messages when read
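
As a side note on the acknowledgement option in Question 7, here is a minimal Kafka producer sketch in Java; the broker address and topic name are placeholders. The acks setting is where the producer trades speed against reliability, and note that nothing deletes a message when it is read: the broker removes messages only when their retention period expires.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class AckTradeoffProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9092"); // placeholder broker
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // acks=0 favours speed (fire and forget), acks=1 waits for the leader only,
            // acks=all waits for every in-sync replica - the reliability end of the trade-off.
            props.put(ProducerConfig.ACKS_CONFIG, "all");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("orders", "order-1", "created"));
            }
        }
    }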

Question 8: In Apache Kafka, all of the following are features of replication except:

  • Followers repeatedly fetch messages from the leader
  • Any in-sync follower can become the leader without message loss
  • A follower replica is considered in-sync when it has the same set of messages as the leader
  • Producers must wait for an acknowledgement from at least one broker to successfully send a message to a topic
  • If brokers write messages to disk more frequently than the default setting, it will cause the overall throughput to decrease
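
To make the replication terms in Question 8 concrete, here is a small admin sketch (broker address and topic name are placeholders) that creates a topic with a replication factor of 3, i.e. one leader plus two followers that keep fetching from it, and min.insync.replicas=2 so an in-sync follower can take over as leader without message loss.

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class ReplicatedTopicExample {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9092"); // placeholder broker

            try (AdminClient admin = AdminClient.create(props)) {
                // Three replicas per partition: one leader plus two followers that continually
                // fetch from it; with min.insync.replicas=2 a write is safe on at least one follower.
                NewTopic orders = new NewTopic("orders", 3, (short) 3)
                        .configs(Map.of("min.insync.replicas", "2"));
                admin.createTopics(List.of(orders)).all().get();
            }
        }
    }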

Question 9: When using Kafka, all of the following can help to reverse a growing offset lag except:

  • Optimize consumers to spend less time processing messages
  • Increase the number of consumers
  • Increase the retention time
  • Change the producer acknowledgement setting from “leader ack” to “all ack”
  • Have consumers poll for messages more frequently
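
For Question 9, here is a minimal consumer-group sketch (broker address, group id, and topic are placeholders). Offset lag shrinks when you run more copies of this consumer in the same group, up to one per partition, or make the per-record work faster so each member can poll more often; raising retention only keeps old messages around longer.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class LagAwareConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.com:9092"); // placeholder broker
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-service");                  // placeholder group id
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            // More instances with the same group.id spread the partitions across members;
            // keeping the work in the loop short lets each member poll more frequently.
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("partition=%d offset=%d value=%s%n",
                                record.partition(), record.offset(), record.value());
                    }
                }
            }
        }
    }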

Question 10: All of the following are advantages of IBM MQ’s Two-Phase Commit (2PC) support for databases except:

  • A retry of a failed 2PC transaction can be kicked off automatically without specific application action.
  • The data shared between the databases and the MQ Queue will not become out of sync
  • After a specified number of failures a message can be automatically diverted to a dead letter queue
  • If the database fails to respond during a 2PC transaction, MQ will delete the message from the queue and raise an error
  • Messages are only destroyed after being successfully consumed and then committed to a database
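
Question 10 is about two-phase commit between MQ and a database. A full 2PC example needs an XA transaction manager, so the sketch below shows only a local (single-resource) transacted session with the IBM MQ classes for JMS, with placeholder connection details, to illustrate the last option: the message is destroyed only when the unit of work commits, and a rollback puts it back on the queue for retry.

    import javax.jms.Destination;
    import javax.jms.JMSContext;
    import javax.jms.JMSException;
    import javax.jms.Message;
    import com.ibm.msg.client.jms.JmsConnectionFactory;
    import com.ibm.msg.client.jms.JmsFactoryFactory;
    import com.ibm.msg.client.wmq.WMQConstants;

    public class TransactedConsumeExample {
        public static void main(String[] args) throws JMSException {
            JmsFactoryFactory ff = JmsFactoryFactory.getInstance(WMQConstants.WMQ_PROVIDER);
            JmsConnectionFactory cf = ff.createConnectionFactory();
            cf.setStringProperty(WMQConstants.WMQ_HOST_NAME, "mq.example.com");  // placeholder host
            cf.setIntProperty(WMQConstants.WMQ_PORT, 1414);
            cf.setStringProperty(WMQConstants.WMQ_CHANNEL, "DEV.APP.SVRCONN");   // placeholder channel
            cf.setStringProperty(WMQConstants.WMQ_QUEUE_MANAGER, "QM1");         // placeholder queue manager
            cf.setIntProperty(WMQConstants.WMQ_CONNECTION_MODE, WMQConstants.WMQ_CM_CLIENT);

            // A session-level (local) transaction: the message stays on the queue until commit.
            try (JMSContext ctx = cf.createContext(JMSContext.SESSION_TRANSACTED)) {
                Destination queue = ctx.createQueue("queue:///DEV.QUEUE.1");     // placeholder queue
                Message msg = ctx.createConsumer(queue).receive(5000);
                try {
                    // ... write the message contents to the database here ...
                    ctx.commit();   // only now is the message destroyed
                } catch (Exception dbFailure) {
                    ctx.rollback(); // the message returns to the queue and can be retried
                }
            }
        }
    }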

Question 11: If you have an existing application with a customer object stored in a database, what would be the quickest way to start using Salesforce contacts as your app’s source of customer data?

  • Rewrite your application to use a customer object modeled after the fields in a Salesforce contact and call the Salesforce APIs directly
  • Write your own microservice to map your customer object’s fields to the fields in a Salesforce contact
  • Deploy a Master Data Management system to map your customer data to Salesforce
  • Create an OpenAPI definition of your customer data management operations (create, retrieve, update and delete) and map it to an event-driven flow in App Connect triggered by Salesforce contact events
  • Use database triggers to send customer database operations to a Kafka cluster and then write an app to feed those into Salesforce.

Question 12: All of the following are benefits of creating a model for a URL driven flow except:

  • Can map the calling app’s data model in App Connect to multiple systems (Salesforce, ServiceNow, etc.) without modifying the calling app.
  • Can modify the calling app’s data model in App Connect to fit the data models of various backends without writing any code
  • Can map responses from external systems back into a format already understood by the calling app.
  • Can easily be incorporated into an OpenAPI definition of the URL driven flow’s endpoint, making it easy for developers to understand and interact with the flow.
  • Makes the flow more secure, as the calling app could encrypt the data without sharing its keys before passing it in to the URL driven flow.

Question 13: All of the following are true about App Connect Designer except:

  • It enables rapid prototyping
  • No coding skills required to use it
  • It generates source code for all the integration flows you build
  • You can add your own API definitions to the palette
  • New connectors are added regularly expanding the number of systems you can use in your flows

Question 14: All of the following are true about App Connect Dashboard except:

  • It allows you to deploy integration flows as Kubernetes pods
  • It generates an executable file from a .bar file that you can export and run on any compatible platform.
  • You can provide different app credentials for a flow deployment than the ones used by the flow developer
  • You can deploy flows developed with App Connect Toolkit and the cloud version of App Connect Designer
  • For heavily used deployments you can scale horizontally by modifying the deployment parameters

Question 15: A legacy JEE application is being retired by an enterprise. It has a UI and some business logic called internally within the app. The business logic uses a database that the company is still using. They would like to build app workflows around this logic, several other API driven systems in their company, and multiple external services like Salesforce and ServiceNow. What is the most efficient way to proceed?

  • Create a thin API layer on top of the business logic and import that into App Connect Designer as an API, along with the other APIs the company uses.
  • Rewrite the app using a more modern UI and deploy it as a standalone application.
  • Bundle the business logic into a Spring Boot app and add new logic in the app as Spring Boot Java APIs
  • Use a code generator to translate the Java business logic into Node.js
  • Rewrite the business logic from scratch as a set of microservices
