CSV Ingestion
We recommend running this tutorial, as well as the other tutorials in the Tutorials section, inside TileDB Cloud. By using TileDB Cloud, you can experiment while avoiding all the installation, deployment, and configuration hassles. Sign up for the free tier, spin up a TileDB Cloud notebook with a Python kernel, and follow the tutorial instructions. If you wish to learn how to run tutorials locally on your machine, read the Tutorials: Running Locally tutorial.
This tutorial shows you how to create tables and ingest data into them directly from a CSV file. You will first perform some basic setup steps, and then you will create two different types of tables: one represented as a 1D dense array, and one as a 2D sparse array. If you wish to understand their differences and impact on performance, read the Tables Data Model section.
Setup
First, import the necessary libraries, set the URIs you will use in this tutorial, and delete any already-created tables with the same name.
You will use a subset of the latest New York City Taxi and Limousine Commission Trip Record Data dataset.
import os
import shutil
import warnings

warnings.filterwarnings("ignore")

import numpy as np
import tiledb
import tiledb.sql

# Print library versions
print("TileDB core version: {}".format(tiledb.libtiledb.version()))
print("TileDB-Py version: {}".format(tiledb.version()))
print("TileDB-SQL version: {}".format(tiledb.sql.version))

# Set table dataset URIs, and the URI to an example CSV
dense_table_uri = "my_dense_table"
sparse_table_uri = "my_sparse_table"
example_csv_uri = (
    "s3://tiledb-inc-demo-data/examples/notebooks/nyc_yellow_tripdata/taxi_first_10.csv"
)

# Set configuration parameters
cfg = tiledb.Config({"vfs.s3.no_sign_request": "true", "vfs.s3.region": "us-east-1"})
ctx = tiledb.Ctx(cfg)

# Clean up the tables if they already exist
if os.path.exists(dense_table_uri):
    shutil.rmtree(dense_table_uri)
if os.path.exists(sparse_table_uri):
    shutil.rmtree(sparse_table_uri)
Ingest CSV into a 1D dense array
You can ingest a CSV file into a TileDB table (which TileDB will create if it doesn’t already exist) as follows. Observe that all you need is the source CSV file and the TileDB table URI. The parse_dates parameter forces certain CSV fields to be represented as datetimes.
Prepare the table for reading using the Python API.
Inspect the schema of the underlying array.
ArraySchema(
domain=Domain(*[
Dim(name='__tiledb_rows', domain=(0, 9), tile=10, dtype='uint64', filters=FilterList([ZstdFilter(level=-1), ])),
]),
attrs=[
Attr(name='VendorID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tpep_pickup_datetime', dtype='datetime64[ns]', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tpep_dropoff_datetime', dtype='datetime64[ns]', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='passenger_count', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='trip_distance', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='RatecodeID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='store_and_fwd_flag', dtype='<U0', var=True, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='PULocationID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='DOLocationID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='payment_type', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='fare_amount', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='extra', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='mta_tax', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tip_amount', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tolls_amount', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='improvement_surcharge', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='total_amount', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='congestion_surcharge', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
],
cell_order='row-major',
tile_order='row-major',
sparse=False,
)
The created table is modeled as a 1D dense TileDB array, with a __tiledb_rows dimension added and the CSV fields stored as attributes.
Read the data into a dataframe with the .df[] method:
VendorID | tpep_pickup_datetime | tpep_dropoff_datetime | passenger_count | trip_distance | RatecodeID | store_and_fwd_flag | PULocationID | DOLocationID | payment_type | fare_amount | extra | mta_tax | tip_amount | tolls_amount | improvement_surcharge | total_amount | congestion_surcharge | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 1 | 2020-01-01 00:28:15 | 2020-01-01 00:33:03 | 1 | 1.20 | 1 | N | 238 | 239 | 1 | 6.00 | 3.0 | 0.5 | 1.47 | 0 | 0.3 | 11.27 | 2.5 |
1 | 1 | 2020-01-01 00:35:39 | 2020-01-01 00:43:04 | 1 | 1.20 | 1 | N | 239 | 238 | 1 | 7.00 | 3.0 | 0.5 | 1.50 | 0 | 0.3 | 12.30 | 2.5 |
2 | 1 | 2020-01-01 00:47:41 | 2020-01-01 00:53:52 | 1 | 0.60 | 1 | N | 238 | 238 | 1 | 6.00 | 3.0 | 0.5 | 1.00 | 0 | 0.3 | 10.80 | 2.5 |
3 | 1 | 2020-01-01 00:55:23 | 2020-01-01 01:00:14 | 1 | 0.80 | 1 | N | 238 | 151 | 1 | 5.50 | 0.5 | 0.5 | 1.36 | 0 | 0.3 | 8.16 | 0.0 |
4 | 2 | 2020-01-01 00:01:58 | 2020-01-01 00:04:16 | 1 | 0.00 | 1 | N | 193 | 193 | 2 | 3.50 | 0.5 | 0.5 | 0.00 | 0 | 0.3 | 4.80 | 0.0 |
5 | 2 | 2020-01-01 00:09:44 | 2020-01-01 00:10:37 | 1 | 0.03 | 1 | N | 7 | 193 | 2 | 2.50 | 0.5 | 0.5 | 0.00 | 0 | 0.3 | 3.80 | 0.0 |
6 | 2 | 2020-01-01 00:39:25 | 2020-01-01 00:39:29 | 1 | 0.00 | 1 | N | 193 | 193 | 1 | 2.50 | 0.5 | 0.5 | 0.01 | 0 | 0.3 | 3.81 | 0.0 |
7 | 2 | 2019-12-18 15:27:49 | 2019-12-18 15:28:59 | 1 | 0.00 | 5 | N | 193 | 193 | 1 | 0.01 | 0.0 | 0.0 | 0.00 | 0 | 0.3 | 2.81 | 2.5 |
8 | 2 | 2019-12-18 15:30:35 | 2019-12-18 15:31:35 | 4 | 0.00 | 1 | N | 193 | 193 | 1 | 2.50 | 0.5 | 0.5 | 0.00 | 0 | 0.3 | 6.30 | 2.5 |
9 | 1 | 2020-01-01 00:29:01 | 2020-01-01 00:40:28 | 2 | 0.70 | 1 | N | 246 | 48 | 1 | 8.00 | 3.0 | 0.5 | 2.35 | 0 | 0.3 | 14.15 | 2.5 |
Ingest CSV into a 2D sparse array
You will ingest the same CSV file, but now into a 2D sparse array. Perform the ingestion as follows. Here are the differences from the 1D dense array case:
- Set sparse to True.
- Set the dimensions in index_dims.
- Set allows_duplicates to True if you wish to allow rows that have the same values across the dimensions.
- Set dim_filters and attr_filters if you wish to set your preferred compression filters, instead of using the defaults.
- Set dtype if you wish to force a data type on certain CSV fields.
tiledb.from_csv(
sparse_table_uri,
example_csv_uri,
ctx=ctx,
sparse=True,
index_dims=["tpep_pickup_datetime", "PULocationID"],
allows_duplicates=True,
dim_filters={
"tpep_pickup_datetime": tiledb.FilterList([tiledb.GzipFilter(level=-1)])
},
attr_filters={"passenger_count": tiledb.FilterList([tiledb.GzipFilter(level=-1)])},
dtype={"fare_amount": np.float32},
parse_dates=["tpep_dropoff_datetime", "tpep_pickup_datetime"],
)
Prepare the table for reading using the Python API.
Inspect the schema of the underlying array.
ArraySchema(
domain=Domain(*[
Dim(name='tpep_pickup_datetime', domain=(numpy.datetime64('2019-12-18T15:27:49.000000000'), numpy.datetime64('2020-01-01T00:55:23.000000000')), tile=numpy.timedelta64(1000,'ns'), dtype='datetime64[ns]', filters=FilterList([GzipFilter(level=-1), ])),
Dim(name='PULocationID', domain=(7, 246), tile=240, dtype='int64', filters=FilterList([ZstdFilter(level=-1), ])),
]),
attrs=[
Attr(name='VendorID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tpep_dropoff_datetime', dtype='datetime64[ns]', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='passenger_count', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([GzipFilter(level=-1), ])),
Attr(name='trip_distance', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='RatecodeID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='store_and_fwd_flag', dtype='<U0', var=True, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='DOLocationID', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='payment_type', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='fare_amount', dtype='float32', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='extra', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='mta_tax', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tip_amount', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='tolls_amount', dtype='int64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='improvement_surcharge', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='total_amount', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
Attr(name='congestion_surcharge', dtype='float64', var=False, nullable=False, enum_label=None, filters=FilterList([ZstdFilter(level=-1), ])),
],
cell_order='row-major',
tile_order='row-major',
capacity=10000,
sparse=True,
allows_duplicates=True,
)
The created table is modeled as a 2D sparse TileDB array, as expected.
Read the data into a dataframe with the .df[] method:
VendorID | tpep_dropoff_datetime | passenger_count | trip_distance | RatecodeID | store_and_fwd_flag | DOLocationID | payment_type | fare_amount | extra | mta_tax | tip_amount | tolls_amount | improvement_surcharge | total_amount | congestion_surcharge | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
tpep_pickup_datetime | PULocationID | ||||||||||||||||
2019-12-18 15:27:49 | 193 | 2 | 2019-12-18 15:28:59 | 1 | 0.00 | 5 | N | 193 | 1 | 0.01 | 0.0 | 0.0 | 0.00 | 0 | 0.3 | 2.81 | 2.5 |
2019-12-18 15:30:35 | 193 | 2 | 2019-12-18 15:31:35 | 4 | 0.00 | 1 | N | 193 | 1 | 2.50 | 0.5 | 0.5 | 0.00 | 0 | 0.3 | 6.30 | 2.5 |
2020-01-01 00:01:58 | 193 | 2 | 2020-01-01 00:04:16 | 1 | 0.00 | 1 | N | 193 | 2 | 3.50 | 0.5 | 0.5 | 0.00 | 0 | 0.3 | 4.80 | 0.0 |
2020-01-01 00:09:44 | 7 | 2 | 2020-01-01 00:10:37 | 1 | 0.03 | 1 | N | 193 | 2 | 2.50 | 0.5 | 0.5 | 0.00 | 0 | 0.3 | 3.80 | 0.0 |
2020-01-01 00:28:15 | 238 | 1 | 2020-01-01 00:33:03 | 1 | 1.20 | 1 | N | 239 | 1 | 6.00 | 3.0 | 0.5 | 1.47 | 0 | 0.3 | 11.27 | 2.5 |
2020-01-01 00:29:01 | 246 | 1 | 2020-01-01 00:40:28 | 2 | 0.70 | 1 | N | 48 | 1 | 8.00 | 3.0 | 0.5 | 2.35 | 0 | 0.3 | 14.15 | 2.5 |
2020-01-01 00:35:39 | 239 | 1 | 2020-01-01 00:43:04 | 1 | 1.20 | 1 | N | 238 | 1 | 7.00 | 3.0 | 0.5 | 1.50 | 0 | 0.3 | 12.30 | 2.5 |
2020-01-01 00:39:25 | 193 | 2 | 2020-01-01 00:39:29 | 1 | 0.00 | 1 | N | 193 | 1 | 2.50 | 0.5 | 0.5 | 0.01 | 0 | 0.3 | 3.81 | 0.0 |
2020-01-01 00:47:41 | 238 | 1 | 2020-01-01 00:53:52 | 1 | 0.60 | 1 | N | 238 | 1 | 6.00 | 3.0 | 0.5 | 1.00 | 0 | 0.3 | 10.80 | 2.5 |
2020-01-01 00:55:23 | 238 | 1 | 2020-01-01 01:00:14 | 1 | 0.80 | 1 | N | 151 | 1 | 5.50 | 0.5 | 0.5 | 1.36 | 0 | 0.3 | 8.16 | 0.0 |
Clean up
Delete the created tables.
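A sketch of the cleanup step, assuming the dense_table_uri and sparse_table_uri values from the setup cell:

```python
import os
import shutil

dense_table_uri = "my_dense_table"
sparse_table_uri = "my_sparse_table"

# Remove the local array directories if they exist; guard with
# os.path.exists because rmtree fails on missing paths.
for uri in (dense_table_uri, sparse_table_uri):
    if os.path.exists(uri):
        shutil.rmtree(uri)

print(os.path.exists(dense_table_uri))  # False
```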