SAP HANA smart data integration

HANA SDI | Smart Data Integration 2.0 – H2H (HANAAdapter) Real-time Replication: Lessons Learned

Hi HANA EIM and SDI Community, in this blog post I would like to share some of the experiences we gained during an SDI HANA-to-HANA (H2H) implementation project. To set the context, we will start with the scenario description and solution architecture. These are the topics covered in this blog: Implementation Scope, Solution Architecture, Best Practices, Challenges, Reengineering of Replication Tasks …
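As a quick orientation for the scenario described above, an H2H real-time replication with the HANAAdapter boils down to a handful of SQL objects on the target system. The following is a minimal sketch only; agent, remote source, schema, and table names are placeholders, and the remote source CONFIGURATION payload is environment-specific:

```sql
-- Register the DP Agent and the HANAAdapter running on it
CREATE AGENT "DP_AGENT_01" PROTOCOL 'TCP' HOST 'dpagent-host' PORT 5050;
CREATE ADAPTER "HanaAdapter" AT LOCATION AGENT "DP_AGENT_01";

-- Remote source pointing at the source HANA system
CREATE REMOTE SOURCE "SRC_HANA" ADAPTER "HanaAdapter"
  AT LOCATION AGENT "DP_AGENT_01"
  CONFIGURATION '...'
  WITH CREDENTIAL TYPE 'PASSWORD' USING '...';

-- Virtual table over the source table, then a real-time subscription
CREATE VIRTUAL TABLE "TGT"."VT_ORDERS" AT "SRC_HANA"."<NULL>"."SRC"."ORDERS";

CREATE REMOTE SUBSCRIPTION "SUB_ORDERS"
  ON "TGT"."VT_ORDERS" TARGET TABLE "TGT"."ORDERS";

-- Start capturing changes, run the initial load, then apply the queued changes
ALTER REMOTE SUBSCRIPTION "SUB_ORDERS" QUEUE;
-- (initial load of TGT.ORDERS happens here, e.g. via a replication task)
ALTER REMOTE SUBSCRIPTION "SUB_ORDERS" DISTRIBUTE;
```

The QUEUE/DISTRIBUTE pair is what makes the cutover consistent: changes arriving during the initial load are queued and applied afterwards.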

Read More »

Salesforce.com adapter for SAP SDI: replication of Salesforce data into HANA tables.

1. Overview The Advantco SFDC adapter is an adapter for SAP HANA Smart Data Integration (SDI). Its purpose is to batch-load or to replicate changed data in real time from Salesforce.com to SAP HANA tables. This blog describes in detail how to replicate Account data from Salesforce.com into HANA tables. 2. Adapter Architecture The SFDC Adapter acts as a bridge: it opens a connection to Salesforce and …
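On the HANA side, the adapter is consumed like any other SDI adapter: a remote source, a virtual table over the Salesforce object, and then batch or real-time loading. A rough sketch, with hypothetical names throughout (the actual adapter name is defined when the Advantco adapter is registered with the DP Agent):

```sql
-- Remote source via the SFDC adapter; adapter/agent names are placeholders
CREATE REMOTE SOURCE "SFDC" ADAPTER "SFDCAdapter"
  AT LOCATION AGENT "DP_AGENT_01"
  CONFIGURATION '...'
  WITH CREDENTIAL TYPE 'PASSWORD' USING '...';

-- Expose the Salesforce Account object as a virtual table
CREATE VIRTUAL TABLE "STAGE"."VT_ACCOUNT"
  AT "SFDC"."<NULL>"."<NULL>"."Account";

-- Batch load: copy the current state into a HANA column table
INSERT INTO "STAGE"."ACCOUNT" SELECT * FROM "STAGE"."VT_ACCOUNT";
```

For real-time replication, a remote subscription on the virtual table would replace the plain INSERT … SELECT.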

Read More »

SAP HANA Cloud Smart Data Integration for Real World Implementation Scenario

Motivation With the intense promotion of SAP HANA (Cloud) Platform Smart Data Integration, I see that easy integration of multiple sources and data utilization in the cloud have been gaining traction in the market, which is sufficient justification to follow this trend more and more. For more information about it, check it out 🙂 Key Aspects SAP HANA Smart Data Integration Motivated by this vision, I decided to …

Read More »

Triple H: Hadoop, Hive, HANA on Windows 10’s Bash Shell (Part-3)

Hello again, this will be my last post in this series. I will show here the setup of my SAP HXE 2.0 instance installed locally on my desktop. I will also describe the steps I performed to access my Hadoop/Hive database, installed on my local Linux environment (Windows 10 Bash Shell), via SAP Smart Data Access, and finally SAP HANA Studio for the virtualization of remote tables …
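The SDA part of the setup above can be sketched in two statements. This assumes an ODBC DSN for Hive has already been configured on the HANA host; the DSN, schema, and table names below are placeholders, and the four-part virtual table path can vary by driver:

```sql
-- SDA remote source to Hive via an ODBC DSN ("HIVE1" is a placeholder
-- defined in the odbc.ini of the HANA host)
CREATE REMOTE SOURCE "HIVE_SRC" ADAPTER "hiveodbc"
  CONFIGURATION 'DSN=HIVE1'
  WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hive;password=...';

-- Virtualize a Hive table and query it from HANA
CREATE VIRTUAL TABLE "SDA"."VT_WEBLOGS"
  AT "HIVE_SRC"."<NULL>"."default"."weblogs";

SELECT COUNT(*) FROM "SDA"."VT_WEBLOGS";
```

Unlike SDI, SDA needs no DP Agent for this scenario; the ODBC driver on the HANA host talks to HiveServer2 directly.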

Read More »

Triple H: Hadoop, Hive, HANA on Windows 10’s Bash Shell (Part-2)

As promised in my previous post, I will be sharing here my Hadoop/Hive installation on my local machine using the Windows 10 Bash Shell. I will show you my setup and the versions of Hadoop and Hive. In the next blog, I will show my local SAP HANA Express Edition connecting to Hadoop/Hive using SDA. To proceed, you will need to make sure you have Bash on Ubuntu installed …

Read More »

Task partitioning enhances initial load performance in HANA smart data integration

Many SAP HANA customers use HANA smart data integration to simplify their data integration landscape and run real-time applications, as announced in a previously published blog. Starting with SAP HANA Rev 122.04, HANA smart data integration introduced task partitioning in the Flowgraph Editor. Task partitioning helps customers load large initial data sets from various supported sources into SAP HANA faster, while making better use of available memory. While SAP HANA Rev …
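Task partitioning itself is configured in the Flowgraph Editor UI, but the underlying idea is easy to show in plain SQL: the initial load is split into independent, non-overlapping slices of the source data that can run in parallel, each with a bounded memory footprint. A hand-written equivalent, with illustrative table and column names:

```sql
-- Range-partitioned initial load: each slice is an independent statement
-- that can run in parallel with the others. Names are illustrative.
INSERT INTO "TGT"."SALES"
  SELECT * FROM "SRC"."VT_SALES" WHERE "ID" BETWEEN 1       AND 1000000;
INSERT INTO "TGT"."SALES"
  SELECT * FROM "SRC"."VT_SALES" WHERE "ID" BETWEEN 1000001 AND 2000000;
-- ... further ranges until the full key space is covered
```

The flowgraph's range or hash partition specification generates this kind of slicing automatically, so each worker only materializes its own range in memory.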

Read More »

Triple H: Hadoop, Hive, HANA on Windows 10’s Bash Shell

This article is for those interested in learning how to install the Apache™ Hadoop® and Apache Hive™ data warehouse on Windows 10’s Bash Shell, and then connect SAP HANA to Apache Hive™ through SAP HANA Studio using SDA (Smart Data Access). The goal here is to use Hadoop and Hive on a local Linux machine, taking advantage of the Windows 10 Anniversary Update as an alternative to a Linux instance in the cloud. …

Read More »

Using ‘Import’ inside a Procedure using SAP HANA SDI

This blog will show you how it is possible to use the IMPORT statement inside a procedure with SAP HANA SDI. As you may know, it is currently not possible to call the IMPORT statement inside a procedure using standard HANA SQLScript. This can be worked around by using the SAP HANA Data Provisioning Agent Configuration tool. Once you have the DP Agent downloaded, open it. You should see a screen like …
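The workaround rests on a simple idea: once the DP Agent's file adapter exposes the flat file as a virtual table, the "import" becomes a plain INSERT … SELECT, which SQLScript does allow inside a procedure. A minimal sketch, assuming a file-based remote source has already been created; all object names are placeholders:

```sql
-- Virtual table over a file exposed by a file-adapter remote source
CREATE VIRTUAL TABLE "STAGE"."VT_CSV"
  AT "FILE_SRC"."<NULL>"."<NULL>"."mydata";

-- The load can now live inside a procedure, unlike IMPORT itself
CREATE PROCEDURE "STAGE"."LOAD_CSV"()
LANGUAGE SQLSCRIPT AS
BEGIN
  INSERT INTO "STAGE"."MYDATA" SELECT * FROM "STAGE"."VT_CSV";
END;
```

Calling `CALL "STAGE"."LOAD_CSV"()` then pulls the current file contents into the target table on demand.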

Read More »