BlackLine Data Targets

Overview

DataBlend supports the following data targets for loading data to BlackLine:

  • Account Balances

  • Subledger Balances

  • Import Definitions (BlackLine Data Sources)

Configuration

Query (Required)
Select the query that will generate the data for this target. See the relevant target type documentation for additional details on which query columns are required for your target.

Query Mode (Required)
Select New unless you have a reason to select Latest or Specific.

Credential (Required)
Select a BlackLine credential from the list. A BlackLine target requires a credential with the DataIngestionAPI scope.

Target Type (Required)
Select Account, Subledger, or ImportDefinition.

Wait for Results (Optional)
If false, the target is asynchronous: the target status indicates whether the result set was successfully delivered to BlackLine for processing. It does NOT indicate the status of the import. If this target is used in a workflow, subsequent workflow steps will proceed whether or not the import has completed.

If true, the target is synchronous: DataBlend will wait for the BlackLine import job to complete and set the target status to reflect the final status of the import.

To learn more about BlackLine API requirements, please visit https://developer.blackline.com/apis.

Account Balances and Subledger Balances

What these targets do

From the BlackLine Accounts API documentation:

Imports Account Balances for Account Reconciliations and Variance Analysis and creates or updates Account Reconciliations for each Period. Provides additional information about each Account such as account type, description, references, and whether there has been activity in the period.

From the BlackLine Sub Ledger Balances documentation:

Imports data for Sub Ledger Balances in Account Reconciliations using the Sub Ledger template.

The data processing workflow

  1. The DataBlend target runs, sending data to BlackLine.

  2. BlackLine saves the data to a local file and creates an import job for the local file.

  3. BlackLine returns the import job id to DataBlend.

  4. If Wait for Results is false, the target will end successfully.

  5. If Wait for Results is true, the target will use the job id to request the status of the import from BlackLine. Any import errors will be reported to the user via the DataBlend target status.

The status of the import can be seen in the BlackLine application under System > Jobs > Import Status.
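As a rough illustration of this submit-and-poll flow, the sketch below shows how the Wait for Results setting changes the behavior of the target. The client object and its submit_import and get_job_status helpers are hypothetical placeholders, not DataBlend internals or documented BlackLine endpoints.

```python
import time

def run_target(client, rows, wait_for_results=False, poll_seconds=30):
    """Illustrative submit-and-poll flow for a BlackLine balances target."""
    # Steps 1-3: send the data; BlackLine stores it, creates an import job,
    # and returns the import job id.
    job_id = client.submit_import(rows)

    # Step 4: Wait for Results = false -> report success as soon as the data
    # has been delivered, without waiting for the import itself.
    if not wait_for_results:
        return {"status": "Delivered", "job_id": job_id}

    # Step 5: Wait for Results = true -> poll the job id until the import
    # finishes and surface any import errors in the target status.
    while True:
        job = client.get_job_status(job_id)
        if job["state"] in ("Completed", "Failed"):
            return {
                "status": job["state"],
                "job_id": job_id,
                "errors": job.get("errors", []),
            }
        time.sleep(poll_seconds)
```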

Configuring the Query

BlackLine provides data import templates for both the Account balance and Sub Ledger balance imports. These are available from the BlackLine Community.

DataBlend queries to be used with these BlackLine imports should use the column headers specified in the relevant BlackLine import template, replacing any spaces with underscores (_). The casing of column names does not matter.
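As a minimal sketch of this renaming rule, the snippet below converts BlackLine template headers to DataBlend query column names by replacing spaces with underscores; the headers shown are the same ones used in the examples that follow.

```python
def to_query_column(template_header: str) -> str:
    """Convert a BlackLine import template header to a DataBlend query column name."""
    # Replace spaces with underscores; casing is left alone because it does not matter.
    return template_header.replace(" ", "_")

print(to_query_column("Entity Unique Identifier"))  # Entity_Unique_Identifier
print(to_query_column("Period End Date"))           # Period_End_Date
print(to_query_column("Key3"))                      # Key3
```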

Some examples:

BlackLine Import Column Name      DataBlend Query Column Name
Entity Unique Identifier          Entity_Unique_Identifier
Period End Date                   Period_End_Date
Key3                              Key3

Import Definitions (BlackLine Data Sources)

Financial transactions are imported to BlackLine using Import Definitions. You can use an existing Import Definition if you know the necessary column headers and order, or you can create a new Import Definition.

If you create a new Import Definition, you will need to provide a JSON file with an example of the data which you will be importing. DataBlend can create this example file for you based on your query results.

Either the BlackLine Import Definition or the DataBlend Import Definition target can be created first.
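For illustration only, the sketch below shows one way such an example file could be built from a few sample query rows. The column names are placeholders, and the file produced by DataBlend's Download JSON File option (described below) may be structured differently.

```python
import json

# Hypothetical sample of query results; the column names are placeholders,
# not a layout required by BlackLine.
sample_rows = [
    {"Entity_Unique_Identifier": "100", "Account": "1000", "Amount": 2500.00},
    {"Entity_Unique_Identifier": "100", "Account": "1010", "Amount": -2500.00},
]

# Write the sample to a JSON file that can be uploaded when configuring the
# BlackLine Import Definition (File Type = JSON).
with open("import_definition_example.json", "w") as handle:
    json.dump(sample_rows, handle, indent=2)
```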

The data processing workflow

Prerequisites

  1. The BlackLine Import Definition must be scheduled to process any incoming data.

  2. The DataBlend Import Definition target must be scheduled to send query results to BlackLine.

Then

  1. The DataBlend target runs, sending data to BlackLine.

  2. BlackLine saves the data to a local file.

  3. BlackLine returns the import job id to DataBlend.

  4. If Wait for Results is false, the target will end successfully.

  5. If Wait for Results is true, the target will use the job id to request the status of the import from BlackLine.

    1. The import job runs at its next scheduled time.

    2. Any import errors will be reported to the user via the DataBlend target status.

The status of the import can be seen in the BlackLine application under System > Jobs > Job Status.

DataBlend BlackLine Import Definition Target

In addition to the general settings, the Import Definition target requires the Web API name of the BlackLine Import Definition. If the Import Definition has not yet been created, leave this field blank and complete it after the Import Definition has been created.

Once the target has been saved, a Download JSON File option will become available. This will generate a JSON file to be used when configuring the BlackLine Import Definition.

Configuring the Query

BlackLine does not provide a template for Import Definitions because Import Definitions can be customized to your needs.

Column names and order must match the Import Definition exactly. If they do not, BlackLine will report a 400 Bad Request error.
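Because a mismatch only surfaces as a 400 Bad Request at run time, it can help to compare the query columns against the Import Definition before sending data. The check below is a hypothetical pre-flight sketch; the expected column list would come from your own Import Definition.

```python
def validate_columns(query_columns, expected_columns):
    """Raise if the query columns do not match the Import Definition exactly.

    Both the names and their order must match; otherwise BlackLine rejects
    the request with a 400 Bad Request error.
    """
    if list(query_columns) != list(expected_columns):
        raise ValueError(
            f"Column mismatch: query produces {list(query_columns)}, "
            f"but the Import Definition expects {list(expected_columns)}"
        )

# Example usage with placeholder column names:
expected = ["Entity_Unique_Identifier", "Account", "Amount"]
validate_columns(["Entity_Unique_Identifier", "Account", "Amount"], expected)  # passes
```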

How to Create a BlackLine Import Definition

Create a new Import Definition and complete the fields.

Set the values at Step 1.

File Type: JSON

Import Name: A free-text identifier for the import job.

Web API Enabled: Set to true. The Web API field will auto-complete based on the Import Name.

Table Type: Matching Data Source

Data Source Name: Can be New or Existing.

Upload an example JSON file for Step 2.

Select a JSON file which demonstrates the structure of the data to be imported. This can be generated by the DataBlend target, if necessary.

Select Upload.

Schedule the job to run as desired. The job MUST be enabled and running in order for DataBlend to contact this API endpoint. When the process runs, it will collect any API submissions you have made to the definition.

Advanced

History Retention

History Retention (Days) allows users to decide how long the information from their data target runs is stored. This field is optional.

Timeout (seconds)

Timeout (seconds) allows users to cancel a run that takes longer than the set number of seconds to complete. This setting is optional.

Skip If No Records Found

The Skip If No Records Found toggle prevents data from being sent to the target unnecessarily: when enabled, the target is skipped if the query returns no records. Use of this toggle is optional.

Agent

The Agent drop-down allows users to select any agent they have established. This is optional.

Run As

Run As allows users to select, from a drop-down list of users, who will run the Workflow. This is optional. Note that Run As is only available to Admin users. If the Run As user is later demoted to a Member, the user who demoted them will be set as the Run As user instead.

Schedule & Presets        

Link to Schedule and Presets

Scheduling Components