In my previous post, Oracle Autonomous Database: Accessing & Using Database Actions, I covered the topic of Database Actions available with Oracle Autonomous Database (and how to access them!).
As a reminder, Database Actions is a web-based interface that leverages ORDS (Oracle REST Data Services) to provide a rich set of development, data studio, administration, and monitoring features for Oracle Autonomous Database services. This is relevant for both Oracle Autonomous Data Warehouse (ADW) and Oracle Autonomous Transaction Processing (ATP) service instances. At a high level, the features of Database Actions include executing SQL statements/scripts, creating data modeler diagrams, developing RESTful web services, managing JSON collections, and using the Data Load/Catalog/Insight features and data transformation tools. See further down this post for a full list and description of available features.
In this post we will look at the Data Load feature of Database Actions, including how to access it and how to use it! These data loading features are really simple to use, allowing you to efficiently create tables or update existing tables with data that you might have stored locally or in the cloud, all without writing any code.
Note 1 – The details in this post assume that you already have an Oracle Cloud free tier (or upgraded!) account. If you don't, you can get one here.
Note 2 – The details below assume that you already have a provisioned & active Oracle Autonomous Database service instance. If you do not, you may wish to follow the steps in my posts: Creating an ATP Database in Oracle Cloud & Connecting to an ATP Database in Oracle Cloud.
Note 3 – Given the regular release schedule in Oracle Cloud, it is possible that the screens may change somewhat after the writing of this post. If this happens, please comment on this post and I will try to help you out (and update this post!).
The Data Load capability in Oracle Autonomous Database's Database Actions provides three key capabilities:
- Load data from a local file into database tables within Oracle Autonomous Database
- Link database tables within Oracle Autonomous Database to files residing in cloud object storage, or to database tables held within other instances of Oracle Autonomous Database (using database links) – new data is only accessed when it is needed
- It is only possible to link data using a database link or cloud storage (local files are not supported)
- Feed data into database tables within Oracle Autonomous Database when new data is loaded into object storage – new data is loaded automatically
- It is only possible to feed data using cloud storage (database links and local files are not supported)
Accessing the Data Load feature
Step 1: Log into your Oracle Cloud account (free tier is suitable)
Step 2: Click on the menu icon (often referred to as “Hamburger” or “Pancake Stack” icon)
Step 3: Select “Oracle Database” and then select “Autonomous Database”
Step 4: Click on the database instance you will use – In my case I will use “ASG_ATP_DB” which is an ATP instance.
Step 5: Click “Database Actions”
Step 6: You will be presented with the Database Actions Launchpad. By default, this will open as the “admin” user that was provisioned when you created your instance. Click on “Data Load” underneath the “Data Studio” sub-section.
You will be presented with the data load feature.
Load Data From Local File
To load data from a file stored on your local machine, follow the steps below:
Step 1: Select “Load Data”
Step 2: Click “Local File” and then “Next”
Step 3: You will be presented with a file load page. You can either drag and drop files onto the palette or use the "Select Files" button to browse your local machine for a file
Step 4: As I chose to browse my local machine for a file to upload, I am presented with Windows File Explorer and have navigated to the file I wish to upload. Note – this is just a sample CSV file downloaded from here.
Step 5: Once the file has uploaded, the following screen is presented. Here you can see more about the file by clicking on the file name "Financials Sample Data.csv"
This screen provides the following “pre-data load” options:
- Alter the data loading option by choosing one of the following options:
- Create table: This will create a brand new table based on the table name given and load the data in the file into it
- Insert into table: This will try to insert the data from the file into an existing database table, retaining any existing data that is already present in the database table
- Replace data: This will remove the existing data from an existing database table and insert the data from the file
- Drop table and create new table: If the database table already exists, this option will drop the table (and the data contained in the table), recreate the table and then load the data from the file into it
- Alter the table name (when creating a new table) or Select an existing table name (when performing other actions)
- Define CSV related options:
- Text enclosure
- Field delimiter
- Confirm which row contains the column headers
- Select the row from which to start processing data
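The CSV options above map directly onto the parameters found in most CSV parsers. As a rough, non-Oracle illustration using Python's built-in csv module (the sample file content and row numbers below are made up for demonstration): the text enclosure is the quote character, the field delimiter separates columns, and the header/start-row choices decide how many leading rows to skip:

```python
import csv

# Made-up sample file content: a title line, then headers, then data rows.
raw = (
    "Financial Sample Export\n"
    '"Segment";"Country";"Units Sold"\n'
    '"Government";"Canada";1618\n'
    '"Midmarket";"France";2178\n'
)

TEXT_ENCLOSURE = '"'   # "Text enclosure" option
FIELD_DELIMITER = ";"  # "Field delimiter" option
HEADER_ROW = 2         # the row that contains the column headers

lines = raw.splitlines()
reader = csv.reader(
    lines[HEADER_ROW - 1:],          # skip everything before the header row
    delimiter=FIELD_DELIMITER,
    quotechar=TEXT_ENCLOSURE,
)
rows = list(reader)
headers, data = rows[0], rows[1:]    # data processing starts on the next row
print(headers)  # ['Segment', 'Country', 'Units Sold']
```

The Data Load tool pre-fills sensible values for these options by inspecting the file, so in most cases you only need to verify them rather than set them by hand.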
The tool itself examines the uploaded file and presents recommended data types (and it does a really good job of this too). When you have finished reviewing/amending the options, click "Close".
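For clarity, the four loading options described above behave like the following SQL patterns. This is only a semantic sketch using Python's built-in sqlite3 with a made-up table – not the Oracle code the tool actually runs:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Rows parsed from the uploaded file (made-up sample data).
file_rows = [("Government", "Canada", 1618), ("Midmarket", "France", 2178)]

# "Create table": create a brand-new table and load the file's rows into it.
cur.execute("CREATE TABLE financials (segment TEXT, country TEXT, units_sold INTEGER)")
cur.executemany("INSERT INTO financials VALUES (?, ?, ?)", file_rows)

# "Insert into table": append the file's rows, retaining any existing data.
cur.executemany("INSERT INTO financials VALUES (?, ?, ?)",
                [("Enterprise", "Germany", 888)])

# "Replace data": remove the existing rows first, then insert the file's rows.
cur.execute("DELETE FROM financials")
cur.executemany("INSERT INTO financials VALUES (?, ?, ?)", file_rows)

# "Drop table and create new table": drop table and data, recreate, reload.
cur.execute("DROP TABLE financials")
cur.execute("CREATE TABLE financials (segment TEXT, country TEXT, units_sold INTEGER)")
cur.executemany("INSERT INTO financials VALUES (?, ?, ?)", file_rows)

count = cur.execute("SELECT COUNT(*) FROM financials").fetchone()[0]
```

The practical difference comes down to whether existing table structure and existing rows survive the load, which is why "Insert into table" is the safe choice for topping up a table and "Replace data" / "Drop table..." are the destructive ones.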
Step 6: Once you are happy with the pre-load checks, you can hit the "Start" button to begin the data load. The job will be sent to the Autonomous Database job scheduler and run in the background.
You can see in the below screenshot that the table does not currently exist in the database.
Step 7: Confirm that you wish to run the data load job by clicking the “Run” button
Step 8: When the job has completed, you will see the number of rows successfully loaded. Notice how the total time is only 2 seconds to load this data!
You can see in the below screenshot that the table now exists in the database with the data items loaded from the CSV file.
You can see in the below screenshot that the data load tool actually produced the code needed to upload this data, which you could save and use in your own scripts if needed.
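The generated code itself is Oracle-specific and depends on your table, but the general idea – turning the parsed rows into a reusable SQL script – can be sketched like this (the table and column names here are hypothetical, not what the tool emits):

```python
# Sketch: build a reusable SQL INSERT script from parsed CSV rows.
# "financials" and its columns are made-up names for illustration.
headers = ["segment", "country", "units_sold"]
rows = [("Government", "Canada", 1618), ("Midmarket", "France", 2178)]

def sql_literal(value):
    """Quote string values (doubling embedded quotes); pass numbers through."""
    if isinstance(value, str):
        return "'" + value.replace("'", "''") + "'"
    return str(value)

statements = [
    "INSERT INTO financials ({}) VALUES ({});".format(
        ", ".join(headers),
        ", ".join(sql_literal(v) for v in row),
    )
    for row in rows
]
script = "\n".join(statements)
print(script)
```

Saving a script like this means the same load can be repeated later without going back through the wizard.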