1. Introduction
In this codelab, you create a service account, use it to configure the Client Key and Communication Map tables provided by the ABAP SDK for Google Cloud, and then invoke the Cloud Storage JSON API from ABAP.
The following Google Cloud services are used in this codelab:
- Compute Engine
- Network Services
- Cloud Shell
- Cloud Storage JSON API V1
Note: When you create a new Google Cloud Project, a specific set of APIs and services, including Cloud Storage, are automatically enabled. This ensures that you can immediately take advantage of this robust storage solution. Therefore you need not enable it as an additional step.
Prerequisites
- Ensure that you have access to a SAP system with ABAP SDK for Google Cloud installed.
- You can refer to the codelab "Install ABAP Platform Trial on Google Cloud Platform and Install ABAP SDK" to set up a new system.
What you'll build
You'll create the following programs in your SAP system using the ABAP SDK for Google Cloud:
- A program that creates a Cloud Storage bucket.
- A program that reads a file from the application server and uploads it to the created Cloud Storage bucket.
2. Requirements
- A browser, such as Chrome or Firefox.
- A Google Cloud project with billing enabled or Create a 90-Day Free Trial account for Google Cloud Platform.
- SAP GUI (Windows or Java) installed on your system. If SAP GUI is already installed on your laptop, connect to SAP using the VM external IP address as the Application Server IP. If you are on a Mac, you can also install SAP GUI for Java, available at this link.
3. Before you begin
- Ensure that you have access to a SAP system with ABAP SDK for Google Cloud installed.
- You can refer to codelab Install ABAP Platform Trial on Google Cloud Platform and Install ABAP SDK to set up a new system.
- In the Google Cloud Console, on the project selector page, select or create a Google Cloud project (for example, abap-sdk-poc).
- Make sure that billing is enabled for your Cloud project. Learn how to check if billing is enabled on a project. Skip this step if you are using the 90-Day Free Trial Account.
- You will use Cloud Shell, a command-line environment running in Google Cloud. From the Cloud Console, click Activate Cloud Shell on the top right corner:
- Run the following commands to authenticate with your account and set the default project to abap-sdk-poc. Zone us-west4-b is used as an example. If needed, change the project and zone in the following commands based on your preference.
gcloud auth login
gcloud config set project abap-sdk-poc
gcloud config set compute/zone us-west4-b
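Optionally, you can confirm that the configuration took effect and that the Cloud Storage API mentioned in the introduction is enabled. The following commands are a quick check (the grep filter is just a convenience):
gcloud config list
gcloud services list --enabled | grep storage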
- You must have access to an SAP system with the ABAP SDK for Google Cloud installed.
- You must complete codelab 1 (Install ABAP Platform Trial 1909 on Google Cloud Platform and Install ABAP SDK for Google Cloud) and codelab 2 (Configure ABAP SDK Authentication using tokens for SAP Hosted on Compute Engine VM) before proceeding with this codelab.
- If you have completed codelab 1 and codelab 2, this would have provisioned you with an ABAP Platform Trial 1909 System on Google Cloud, along with the required setup for authentication and connectivity.
- If you have not completed codelab 1 and codelab 2, you will not have all the required infrastructure and connectivity to perform the steps provided in this codelab. Therefore, you must complete codelab 1 and codelab 2 before proceeding with this codelab.
4. Create a Service Account with Storage Object User Role
To create a service account with the required role, perform the following steps:
- Run the following command in the Cloud Shell terminal:
gcloud iam service-accounts create abap-sdk-storage-tester \
--display-name="Service Account for Cloud Storage"
- Now add the required role to the service account created in the previous step:
gcloud projects add-iam-policy-binding abap-sdk-poc \
--member='serviceAccount:abap-sdk-storage-tester@abap-sdk-poc.iam.gserviceaccount.com' \
--role='roles/storage.objectUser'
The above command uses abap-sdk-poc as a placeholder for the Google Cloud project. Replace abap-sdk-poc with your project ID.
- To verify that the role has been added, go to the IAM page. The service account you created should be listed along with the role assigned to it, as shown below:
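You can also verify the binding from Cloud Shell. The following command is a sketch (adjust the project and service account name if you changed them) that lists the roles granted to the service account:
gcloud projects get-iam-policy abap-sdk-poc \
--flatten="bindings[].members" \
--filter="bindings.members:serviceAccount:abap-sdk-storage-tester@abap-sdk-poc.iam.gserviceaccount.com" \
--format="table(bindings.role)"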
5. Create Client Key Configuration
Now that you have set up the prerequisites on the Google Cloud side, we can move ahead with the configuration on the SAP side.
For authentication and connectivity related configuration, the ABAP SDK for Google Cloud uses the table /GOOG/CLIENT_KEY.
To maintain the configuration in the /GOOG/CLIENT_KEY table, perform the following steps:
- In the SAP GUI, enter transaction code SPRO.
- Click SAP Reference IMG.
- Click ABAP SDK for Google Cloud > Basic Settings > Configure Client Key.
- Maintain the following values against the fields:
Field | Value
Google Cloud Key Name | TEST_STORAGE
Google Cloud Service Account Name | abap-sdk-storage-tester@abap-sdk-poc.iam.gserviceaccount.com
Google Cloud Scope | https://www.googleapis.com/auth/cloud-platform
Project ID | abap-sdk-poc
Authorization Class | /GOOG/CL_AUTH_GOOGLE (the token-based authorization class configured in codelab 2)
Leave all other fields blank.
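To quickly check the entry, you can run a small test report that only instantiates the Storage API client with the key name maintained above and reports any configuration error. This is a minimal sketch; it performs no API call:
DATA lo_client TYPE REF TO /goog/cl_storage_v1.

TRY.
    " Instantiating the client reads the TEST_STORAGE entry from /GOOG/CLIENT_KEY
    lo_client = NEW #( iv_key_name = 'TEST_STORAGE' ).
    MESSAGE 'Client key TEST_STORAGE is configured' TYPE 'S'.
  CATCH /goog/cx_sdk INTO DATA(lo_excp).
    " Missing or inconsistent configuration surfaces as an SDK exception
    MESSAGE lo_excp->get_text( ) TYPE 'S' DISPLAY LIKE 'E'.
ENDTRY.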
6. Create a Z-Report to Create a Bucket on Cloud Storage
- Log in to your SAP System.
- Go to transaction code SE38 and create a Report Program with the name ZDEMO_CREATE_BUCKET.
- In the pop-up that opens, provide details as shown below:
In the next pop-up, either select Local Object or provide a package name of your choice.
- In the ABAP Editor, add the following code:
DATA lv_json_response TYPE string.
DATA ls_input         TYPE /goog/cl_storage_v1=>ty_001.
DATA lo_storage       TYPE REF TO /goog/cl_storage_v1.

TRY.
    " Instantiate the API client using the client key configured earlier
    lo_storage = NEW #( iv_key_name = 'TEST_STORAGE' ).

    " Bucket name should be globally unique & permanent
    ls_input = VALUE #( name = 'newtest_bucket_abapsdk_gcloud001' ).

    " Call the Buckets: insert method of the Cloud Storage JSON API
    lo_storage->insert_buckets(
      EXPORTING
        iv_q_project = CONV #( lo_storage->gv_project_id )
        is_input     = ls_input
      IMPORTING
        es_raw       = lv_json_response
        es_output    = DATA(ls_output)
        ev_ret_code  = DATA(lv_ret_code)
        ev_err_text  = DATA(lv_err_text)
        es_err_resp  = DATA(ls_err_resp) ).

    IF lo_storage->is_success( lv_ret_code ) = abap_true.
      cl_demo_output=>new(
        )->begin_section( 'Result:'
        )->write_text( 'Bucket was created'
        )->next_section( 'JSON Response:'
        )->write_json( lv_json_response
        )->display( ).
    ELSE.
      DATA(lv_msg) = lv_ret_code && ':' && lv_err_text.
      cl_demo_output=>new(
        )->begin_section( 'Result:'
        )->write_text( 'Bucket creation failed'
        )->next_section( 'Error:'
        )->write_json( lv_msg
        )->display( ).
    ENDIF.

  CATCH /goog/cx_sdk INTO DATA(lo_sdk_excp).
    lv_msg = lo_sdk_excp->get_text( ).
    MESSAGE lv_msg TYPE 'S' DISPLAY LIKE 'E'.
ENDTRY.
Note: if the bucket name is not globally unique, the bucket will not be created. Therefore, use a unique bucket name before executing the code.
- Save and activate the report.
- Execute the report (Press F8).
On successful execution you should see the report output as shown below:
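Optionally, you can also confirm from Cloud Shell that the bucket now exists (assuming you kept the example bucket name from the report above):
gcloud storage buckets describe gs://newtest_bucket_abapsdk_gcloud001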
7. Create a Z-Report to Read a File from Application Server and Upload it to Cloud Storage Bucket
Before you perform this activity, you need to prepare a large text file. In this example, a large text file has already been created and uploaded to the application server. You can use transaction code CG3Z to upload a file to your SAP system's application server, and transaction AL11 to verify that it is in place.
For this example, we are using a text file of size ~40 MB, which is already uploaded to the application server in the /tmp directory.
You can also download the sample file from GitHub using the following link: Sample File
- Log in to your SAP system.
- Go to transaction code SE38 and create a Report Program with the name ZDEMO_UPLOAD_FILE.
- In the pop-up that opens, provide details as shown below:
In the next pop-up, either select Local Object or provide a package name of your choice.
- In the ABAP Editor, add the following code:
DATA lv_file_length TYPE i.
DATA lv_msg         TYPE string.
DATA lv_dset        TYPE string.
DATA lv_data        TYPE string.
DATA ls_data        TYPE xstring.
DATA lo_storage     TYPE REF TO /goog/cl_storage_v1.

" Read file data from the application server
DATA(dset) = '/tmp/sample_file.txt'.

OPEN DATASET dset FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Cannot open/read dataset' TYPE 'E'.
ENDIF.

DO.
  READ DATASET dset INTO lv_dset.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  CONCATENATE lv_data lv_dset INTO lv_data SEPARATED BY cl_abap_char_utilities=>newline.
  CLEAR lv_dset.
ENDDO.
CLOSE DATASET dset.

" Convert the file content to XSTRING, as expected by the API client stub
CALL FUNCTION 'SCMS_STRING_TO_XSTRING'
  EXPORTING
    text   = lv_data
  IMPORTING
    buffer = ls_data
  EXCEPTIONS
    failed = 1
    OTHERS = 2.
IF sy-subrc <> 0.
  MESSAGE 'Conversion from string to xstring failed' TYPE 'E'.
ENDIF.

TRY.
    " Instantiate the API client using the client key configured earlier
    lo_storage = NEW #( iv_key_name = 'TEST_STORAGE' ).

    " Request a resumable upload so the file data is sent in chunks
    lo_storage->add_common_qparam( iv_name  = 'uploadType'
                                   iv_value = 'resumable' ).

    " Upload the file to the bucket created in the previous report
    lo_storage->insert_objects(
      EXPORTING
        iv_q_name       = 'large_text_file_demo.txt'
        iv_p_bucket     = 'newtest_bucket_abapsdk_gcloud001'
        is_data         = ls_data
        iv_content_type = 'text/plain'
      IMPORTING
        es_output       = DATA(ls_output)
        ev_ret_code     = DATA(lv_ret_code)
        ev_err_text     = DATA(lv_err_text)
        es_err_resp     = DATA(ls_err_resp) ).

    IF lo_storage->is_success( lv_ret_code ) = abap_true.
      cl_demo_output=>new(
        )->begin_section( 'Result:'
        )->write_text( 'Object was uploaded successfully'
        )->write_text( 'Object Self Link:'
        )->write_text( ls_output-self_link
        )->display( ).
    ELSE.
      lv_msg = lv_ret_code && ':' && lv_err_text.
      cl_demo_output=>new(
        )->begin_section( 'Error:'
        )->write_text( lv_msg
        )->display( ).
    ENDIF.

  CATCH /goog/cx_sdk INTO DATA(lo_sdk_excp).
    lv_msg = lo_sdk_excp->get_text( ).
    MESSAGE lv_msg TYPE 'S' DISPLAY LIKE 'E'.
    RETURN.
ENDTRY.

" Close the HTTP connection to the Storage service
lo_storage->close( ).
- Save and activate the report.
- Execute the report (Press F8).
On successful execution you should see the report output as shown below:
You can verify whether the file has been uploaded successfully by navigating to your Cloud Storage bucket as shown below.
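Alternatively, you can list the uploaded object from Cloud Shell (assuming the example bucket and object names used in the report):
gcloud storage ls gs://newtest_bucket_abapsdk_gcloud001/large_text_file_demo.txt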
8. Upload File to Cloud Storage Using ABAP SDK: Code Explained
In essence, this ABAP program integrates with Google Cloud Storage. It reads a file from the application server and passes the file data to the API client stub of the Storage API, which uploads it to the storage bucket created in the earlier report program.
The report program that you have created to upload a file does the following:
Step-by-Step Breakdown
Establish Connection:
- It establishes an HTTP connection to the Cloud Storage service using the /GOOG/CL_STORAGE_V1 class.
Read File Data
- Reads the file on the application server using OPEN DATASET and then calls the standard SAP function module SCMS_STRING_TO_XSTRING to convert the data to XSTRING format.
Add Common Query Parameters
- To achieve chunking, the upload type to choose is "resumable" upload. By default, when resumable upload is selected, the file data is divided into chunks of 8 MB and uploaded. Developers can alter this chunk size by setting the parameter IV_P_CHUNK_SIZE, but it is recommended to use the default setting.
- To let the API method know that the "resumable" upload option has to be chosen, we call the method ADD_COMMON_QPARAM and pass uploadType as resumable.
Insert Objects
- Calls the method INSERT_OBJECTS by passing the following parameters to it:
- IV_Q_NAME: File name with which the contents are to be stored in Cloud Storage
- IV_P_BUCKET: Bucket name where the file has to be uploaded
- IS_DATA: File data that needs to be uploaded
- IV_CONTENT_TYPE: Content type of the file; for the current scenario we are using "text/plain", as we are uploading a text file
Note that we do not pass any value to the chunk-size parameter IV_P_CHUNK_SIZE mentioned above, and let the API client stub use the default value associated with it, which is 8 MB. For the ~40 MB sample file, this means the upload is sent in roughly five 8 MB chunks.
Handle Success/Errors:
- Displays API response based on whether the API call was successful or not.
Close Connection:
- Closes the HTTP connection to the Storage Service.
9. Congratulations
Congratulations! You have successfully completed the "Uploading a File to Cloud Storage Bucket" Codelab.
The Cloud Storage JSON API has many capabilities, and with the ABAP SDK for Google Cloud you can access them natively from your SAP systems using ABAP.
Google Cloud Storage is a great option for storing and managing large amounts of data. It is used by a wide range of businesses, enterprises and applications.
Some of the benefits of using Google Cloud Storage:
- Cost-effectiveness: Google Cloud Storage is a cost-effective way to store and manage large amounts of data.
- Simplicity: Google Cloud Storage is easy to use, with a simple and intuitive API.
- Flexibility: Google Cloud Storage can be used with a variety of applications and platforms.
You can now proceed with the following codelabs to continue your learning journey of using the ABAP SDK to access various Google Cloud services.
- Send an event to Pub/Sub
- Receive an event from Cloud Pub/Sub
- Use Cloud Translation API to translate texts
- Use DLP API for PII redaction
- Call BigQuery ML from ABAP
10. Clean up
If you do not wish to continue with the additional codelabs related to ABAP SDK for Google Cloud, please proceed with the cleanup.
Delete the project
- Delete the Google Cloud project:
gcloud projects delete abap-sdk-poc
Delete individual resources
- Delete the compute instance:
gcloud compute instances delete abap-trial-docker
- Delete the firewall-rules:
gcloud compute firewall-rules delete sapmachine
- Delete the service accounts (including the one created in this codelab):
gcloud iam service-accounts delete \
abap-sdk-storage-tester@abap-sdk-poc.iam.gserviceaccount.com
gcloud iam service-accounts delete \
abap-sdk-dev@abap-sdk-poc.iam.gserviceaccount.com