
C_DS_42 Sample Questions & Answers

Questions 4

You are asked to perform either the initial load or the delta load based on the value of a variable that is set at job execution.

How do you design this requirement in SAP Data Services?

Options:

A.

Set the job to call the initial and delta dataflow in parallel. Each dataflow should have a filter testing for the variable value.

B.

Use a job containing a script with the ifthenelse() function to test the variable value. Connect this script to the initial and delta dataflows.

C.

Use a job containing a Case transform testing for the two possible conditions. Connect one case output to the initial dataflow and the other to the delta dataflow.

D.

Use a job containing a Conditional object that tests the value of the variable. In the IF part, call the initial dataflow; in the ELSE part, call the delta dataflow.
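Whichever design object implements it, the scenario describes a single runtime branch on a variable set at job execution: exactly one of the two dataflows should run. A rough Python sketch of that dispatch (the load-type values and dataflow stubs are illustrative, not Data Services API):

```python
def run_initial_dataflow():
    # Stand-in for the initial-load dataflow.
    return "initial load executed"

def run_delta_dataflow():
    # Stand-in for the delta-load dataflow.
    return "delta load executed"

def run_job(load_type):
    # Mirrors a Conditional object: test the variable once and
    # call exactly one of the two dataflows.
    if load_type == "INITIAL":
        return run_initial_dataflow()
    else:
        return run_delta_dataflow()
```

For example, `run_job("DELTA")` executes only the delta branch, which is the behavior the question asks for.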

Questions 5

What is the relationship between local variables and parameters in SAP Data Services?

Note: There are 2 correct answers to this question

Options:

A.

A local variable in a workflow sets the value of a parameter in a dataflow.

B.

A local variable in a job sets the value of a parameter in a workflow.

C.

A parameter in a workflow sets the value of a local variable in a dataflow.

D.

A parameter in a job sets the value of a local variable in a dataflow.
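The underlying rule is that values flow downward one level at a time: a local variable declared at the calling level supplies the value of a parameter declared at the called level (job to workflow, workflow to dataflow). A rough Python analogy, treating each level as a function and each parameter as a function argument (names are illustrative only):

```python
def dataflow(p_load_date):
    # A dataflow parameter receives its value from the caller.
    return f"loading rows for {p_load_date}"

def workflow(p_load_date):
    # A workflow parameter can in turn feed a dataflow
    # parameter one level further down.
    return dataflow(p_load_date)

def job():
    # The job declares a local variable and passes it to the
    # workflow's parameter: one hop per level.
    l_load_date = "2024-03-22"
    return workflow(l_load_date)
```

In this analogy a "parameter setting a local variable" would mean data flowing upward from callee to caller, which is not how the call chain works.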

Questions 6

A dataflow contains a Pivot transform followed by a Query transform that performs an aggregation. The aggregation query should be pushed down to the database in SAP Data Services.

Where would you place the Data_Transfer transform to do this?

Options:

A.

Before the pivot transform

B.

Between the pivot transform and the query transform

C.

After the query transform

D.

Before the pivot transform and after the query transform.

Questions 7

How do you design a data load that has good performance and deals with interrupted loads in SAP Data Services?

Options:

A.

By setting the target table loader with Bulk Load and Auto Correct Load enabled.

B.

By setting the target table loader with Bulk Load enabled.

C.

By using the Table Comparison transform.

D.

By creating two dataflows and executing the Auto Correct Load version when required.

Questions 8

In SAP Data Services, which function delivers the same results as nested ifthenelse() functions?

Options:

A.

Match_Pattern

B.

Decode

C.

Literal

D.

Match_regex
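For context: Data Services' decode() evaluates a list of condition/result pairs and returns the result paired with the first true condition, falling back to a default, which is what a chain of nested ifthenelse() calls expresses. A minimal Python sketch of that equivalence (the helper names mirror the Data Services built-ins but are illustrative; note that in Data Services the default is the last argument of decode(), while here it comes first for Python's *args convenience):

```python
def ifthenelse(condition, then_value, else_value):
    # Mirrors ifthenelse(cond, a, b): a if cond is true, else b.
    return then_value if condition else else_value

def decode(default, *pairs):
    # Returns the result paired with the first true condition,
    # or the default if none is true.
    for condition, result in pairs:
        if condition:
            return result
    return default

x = 75
# Nested ifthenelse() calls...
nested = ifthenelse(x < 50, "low", ifthenelse(x < 100, "mid", "high"))
# ...produce the same result as a single flat decode().
flat = decode("high", (x < 50, "low"), (x < 100, "mid"))
```

Both expressions classify `x` the same way; decode() simply flattens the nesting into one call.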

Questions 9

You decide to distribute the execution of a job across multiple job servers within a server group. What distribution levels are available?

Note: There are 3 correct answers to this question

Options:

A.

Workflow

B.

Sub-dataflow

C.

Job

D.

Dataflow

E.

Embedded dataflow

Questions 10

An SAP Data Services dataflow must load the source table data into a target table, but the column names are different. Where do you assign each source column to the matching target column?

Options:

A.

In the table reader

B.

In a table loader

C.

In a query transform

D.

In the Map transform

Questions 11

You want to load data from an input table to an output table using the SAP Data Services Query transform. How do you define the mapping of the columns within a Query transform?

Options:

A.

Drag one column from the input schema to the output schema

B.

Select an output column and enter the mapping manually.

C.

Drag one column from the output schema to the input schema

D.

Select one input column and enter the mapping manually

Questions 12

Your SAP Data Services job design includes an initialization script that truncates rows in the target prior to loading. The job uses automatic recovery.

How would you expect the system to behave when you run the job in recovery mode?

Note: There are 2 correct answers to this question

Options:

A.

The job executes the script if it is part of a workflow marked as a recovery unit, but only if an error was raised.

B.

The job executes the script if it is part of a workflow marked as a recovery unit, irrespective of where the error occurred in the job flow.

C.

The job starts with the flow that caused the error. If this flow is after the initialization script, the initialization script is skipped.

D.

The job reruns all workflows and scripts. When using automatic recovery, only dataflows that ran successfully in the previous execution are skipped.

Exam Code: C_DS_42
Exam Name: SAP Certified Application Associate - Data Integration with SAP Data Services 4.2
Last Update: Mar 22, 2024
Questions: 80