
SAP Data Services

New Features of BODS 4.2 and BODS 4.1

This document provides information on some of the new features of SAP Data Services 4.2 and 4.1 that can be useful while working with DS. New Features of BODS 4.1: The major improvement of the 4.1 version is its integration with SAP and non-SAP systems. Enhanced …

Read More »

Connect Data Services 4.2 SP4 With Mainframe ADABAS

Good afternoon, I would like to develop an ETL using SAP Data Services 4.2 SP4 connecting to an ADABAS mainframe. The product documentation mentions: 2.5.2.1 Mainframe interface The software provides the Attunity Connector datastore that accesses mainframe data sources through Attunity Connect. The data sources that Attunity Connect accesses are …

Read More »

Case Transform Example

The Case transform is used to route a row to different outputs depending on conditions. It specifies multiple paths in a single transform (different rows are processed in different ways). The Case transform simplifies branch logic in data flows by consolidating case or decision-making logic in one transform. Paths are …
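
For illustration only (the labels, table, and column below are hypothetical and not taken from the article), the Case editor is filled in with one Boolean expression per output label, and each row is routed to the path(s) whose expression evaluates to true:

    Label      Expression
    NORTH      CUSTOMER.REGION = 'NORTH'
    SOUTH      CUSTOMER.REGION = 'SOUTH'
    DEFAULT    (default output for rows that satisfy none of the expressions)

The transform's options control whether a row may satisfy more than one label and whether a default output is produced.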

Read More »

SAP Data Services System & Performance Optimization

The following post is an attempt to summarize the Data Services Performance Optimization Guidelines and make them shorter and easier to read. In order to achieve an improvement, I suggest you stick to the following sequence: ‘measure’, ‘act’, ‘measure’. Before moving to actual DS optimization, it is recommended to …

Read More »

Adding Jobs Under Projects in an Easy Way

We can add jobs under projects by dragging them into the project folder (from the object library to the project area). But if we have hundreds of jobs, it is painful to add each job by dragging it into the project folder, so by using the steps below we can add jobs under projects at a …

Read More »

Row Per Commit & Array Fetch Size in BODS

This article provides information on Rows per commit and Array fetch size. These two BODS performance settings improve job execution time in practical scenarios. Rows per commit: For the best performance, BODS recommends setting the Rows per commit value between 500 and 2000. The default …
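
To put rough numbers on the idea (an illustration, not figures from the article): loading 1,000,000 rows with Rows per commit = 1000 means the target loader commits roughly 1,000 times instead of once per row, and fetching the same 1,000,000 rows with Array fetch size = 1000 takes roughly 1,000 fetch round trips to the source database instead of 1,000,000.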

Read More »

History Preserving with precise timestamps

SAP Data Services’ History_Preserving transform does a good job of reducing the complexity of history preserving when loading data into a data warehouse. However, it has the limitation that the ValidFrom and ValidTo columns can only be a date – not a timestamp. So, to allow for history preserving of intra-day changes, …

Read More »

Export substitution parameters to CSV

Why do we need substitution parameters in Excel? 😕 In the Designer, we see substitution parameters in a grid view, whereas when we export them, we only have XML and ATL as options. These formats are not straightforward for humans to read. If there is a …

Read More »

How to apply a Permanent License Key in BODS

This document describes the process to apply a license in a BODS system, with screenshots. However, it is not a substitute for the main administration guide, which is available on the Marketplace. License Manager: After the initial BODS installation, the system runs on a trial license for 90 days. After that, we have to go for …

Read More »

How to extract data from different file types using Wait_For_File().

The requirement was to process various master data mapping files, such as chart-of-accounts and sub-department mappings, and also to process transactional monthly General Ledger and monthly Sales data files. We accomplished this task by building a batch job with Wait_For_File and Conditional workflows. Chart_of_Account: FileName_COA_MAP.TXT SubDepartment: FileName_CC_MAP.TXT GL File: FileName_GL.TXT Sales File: FileName_Sales.TXT wait_for_file('FolderName/COA_Map.csv',0,0,1,$FileType); …
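
As a rough sketch of the pattern (the folder, file pattern, wait times, and the variables $G_FileFound and $G_FileName are hypothetical, not the article's actual names, and the timeout and poll interval are assumed to be in milliseconds), a script before the branch can call wait_for_file() and test its return value, where 1 indicates that at least one matching file was found:

    # Wait up to 2 minutes, polling every 10 seconds, for at most one matching file;
    # the matched file name is written into $G_FileName.
    $G_FileFound = wait_for_file('D:/Inbound/*_COA_MAP.TXT', 120000, 10000, 1, $G_FileName);

    if ($G_FileFound = 1)
    begin
        print('Processing mapping file: ' || $G_FileName);
    end
    else
    begin
        print('No chart-of-accounts mapping file arrived within the wait window.');
    end

In the actual job, the branching would typically be done with a Conditional workflow whose If expression tests the same variable; the script if/else above only illustrates the idea.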

Read More »

Address Cleanse Best Practices for India

Input Format: The Global Address Cleanse transform deployed within Data Services or the DQM SDK supports a wide variety of input fields that can be mapped to an incoming data source. The input fields can be categorized as multiline, discrete, or hybrid. Multiline fields can …

Read More »

SAP Data Services 4.2 SP05 Upgrade steps

Purpose: The purpose of this document is to upgrade SAP BusinessObjects Data Services from 11.7/3.X/4.X to SAP Data Services 4.2 SP05. Overview: Environment details: Operating system: Windows Server 2008 64-bit; Database: Microsoft SQL Server 2012 R2; Web application: Tomcat; SAP BusinessObjects tools: SAP BusinessObjects Information Platform …

Read More »

Conditional execution of a job

Hi Team, it’s a basic but very common requirement, so I wanted to share it for beginners. Purpose: the job should execute for a selected country; if there are no records for the selected country, the job should terminate. Create 2 global variables at the job level: $CNTR : …
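
As a rough sketch of the record-count check (the datastore DS_TGT, table SALES, and the variables $G_CNTR and $G_CNT are hypothetical stand-ins, not the article's actual names), a script at the start of the job could look like this:

    # $G_CNTR holds the selected country; $G_CNT (declared as int) receives the record count.
    # Curly braces substitute the variable's value into the SQL string enclosed in quotes.
    $G_CNT = sql('DS_TGT', 'SELECT COUNT(*) FROM SALES WHERE COUNTRY = {$G_CNTR}');

    # Terminate the job when there is nothing to process for the selected country.
    if ($G_CNT = 0)
    begin
        raise_exception('No records found for country ' || $G_CNTR || ', terminating the job.');
    end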

Read More »

Data_Transfer

I have a doubt about the Data_Transfer transform. The transfer type can be Automatic, File, or Database. I selected File and provided the required directory and file. Unfortunately, the file I provided was my source file, and after executing the job my source file disappeared. Could you please …

Read More »

Mask Functionality in DS 4.2 SP 04

Intro: In my post SAP Data Services 4.2 SP4 New Features, I highlighted some new functionality introduced in Data Services 4.2 SP04. One of the new features was the ability for Data Services to mask data. This blog focuses on how you can mask data. The Data Mask transform enables …

Read More »

Data Services Best Practices Migration Strategies

Migration Strategies: Data Services supports two methods to move code through the development cycle: export/import and using a central repository. The export/import method can be used for a small project, while a central repository is better suited for a medium to large project team. 1. Migration Using Export/Import: Export/import is the basic method …

Read More »