
Merge pull request #1666 from ion-elgreco/docs/update_guide
docs: small consistency update in guide and readme
rtyler authored Sep 26, 2023
2 parents 0a5aa39 + b14d7b5 commit 93eb9ce
Showing 2 changed files with 14 additions and 12 deletions.
5 changes: 2 additions & 3 deletions README.md
@@ -66,8 +66,7 @@ The `deltalake` library aims to adopt patterns from other libraries in data processing,
so getting started should look familiar.

```py3
-from deltalake import DeltaTable
-from deltalake.write import write_deltalake
+from deltalake import DeltaTable, write_deltalake
import pandas as pd

# write some data into a delta table
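As context for the import change above, here is a minimal end-to-end sketch of the pattern the README snippet is building toward; the path and sample data below are illustrative, not part of this diff:

```py3
from deltalake import DeltaTable, write_deltalake
import pandas as pd

# write a small DataFrame out as a new Delta table (path is illustrative)
df = pd.DataFrame({"x": [1, 2, 3]})
write_deltalake("./data/delta", df)

# load the table back and materialize it as a pandas DataFrame
dt = DeltaTable("./data/delta")
print(dt.to_pandas())
```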
@@ -139,7 +138,7 @@ of features outlined in the Delta [protocol][protocol] is also [tracked](#protoc
| S3 - R2 | ![done] | ![done] | requires lock for concurrent writes |
| Azure Blob | ![done] | ![done] | |
| Azure ADLS Gen2 | ![done] | ![done] | |
-| Microsoft OneLake | [![open]][onelake-rs] | [![open]][onelake-rs] | |
+| Microsoft OneLake | ![done] | ![done] | |
| Google Cloud Storage | ![done] | ![done] | |
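The S3 rows above note that concurrent writes require a lock; delta-rs supports a DynamoDB-based locking provider configured through storage options. A hedged sketch, assuming these option keys apply to your version (the bucket, table path, and lock-table name are all illustrative):

```py3
import pandas as pd
from deltalake import write_deltalake

df = pd.DataFrame({"x": [1, 2, 3]})

# plain S3 has no atomic rename, so concurrent writers coordinate
# through a DynamoDB lock table (all names here are illustrative)
write_deltalake(
    "s3://my-bucket/my-table",
    df,
    storage_options={
        "AWS_S3_LOCKING_PROVIDER": "dynamodb",
        "DYNAMO_LOCK_TABLE_NAME": "delta_lock",
    },
)
```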

### Supported Operations
21 changes: 12 additions & 9 deletions python/docs/source/usage.rst
@@ -75,13 +75,16 @@ For Databricks Unity Catalog authentication, use environment variables:
* DATABRICKS_ACCESS_TOKEN

.. code-block:: python

->>> from deltalake import DataCatalog, DeltaTable
->>> catalog_name = 'main'
->>> schema_name = 'db_schema'
->>> table_name = 'db_table'
->>> data_catalog = DataCatalog.UNITY
->>> dt = DeltaTable.from_data_catalog(data_catalog=data_catalog, data_catalog_id=catalog_name, database_name=schema_name, table_name=table_name)
+>>> import os
+>>> from deltalake import DataCatalog, DeltaTable
+>>> os.environ['DATABRICKS_WORKSPACE_URL'] = "https://adb-62800498333851.30.azuredatabricks.net"
+>>> os.environ['DATABRICKS_ACCESS_TOKEN'] = "<DBAT>"
+>>> catalog_name = 'main'
+>>> schema_name = 'db_schema'
+>>> table_name = 'db_table'
+>>> data_catalog = DataCatalog.UNITY
+>>> dt = DeltaTable.from_data_catalog(data_catalog=data_catalog, data_catalog_id=catalog_name, database_name=schema_name, table_name=table_name)
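Once ``from_data_catalog`` resolves the table, it reads like any locally loaded ``DeltaTable``. A brief sketch, assuming the lookup above succeeded:

.. code-block:: python

    >>> dt.schema()          # inspect the table schema
    >>> df = dt.to_pandas()  # read the table into a pandas DataFrame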
.. _`s3 options`: https://docs.rs/object_store/latest/object_store/aws/enum.AmazonS3ConfigKey.html#variants
.. _`azure options`: https://docs.rs/object_store/latest/object_store/azure/enum.AzureConfigKey.html#variants
@@ -458,7 +461,7 @@ DataFrame, a PyArrow Table, or an iterator of PyArrow Record Batches.

.. code-block:: python

->>> from deltalake.writer import write_deltalake
+>>> from deltalake import write_deltalake
>>> df = pd.DataFrame({'x': [1, 2, 3]})
>>> write_deltalake('path/to/table', df)
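A related note on this hunk: by default ``write_deltalake`` raises if the target table already exists, and the ``mode`` argument controls what happens instead. A short sketch of the common variants:

.. code-block:: python

    >>> from deltalake import write_deltalake
    >>> import pandas as pd
    >>> df = pd.DataFrame({'x': [4, 5, 6]})
    >>> write_deltalake('path/to/table', df, mode='append')     # add rows to the existing table
    >>> write_deltalake('path/to/table', df, mode='overwrite')  # replace the table contents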
@@ -492,7 +495,7 @@ the method will raise an error.

.. code-block:: python

->>> from deltalake.writer import write_deltalake
+>>> from deltalake import write_deltalake
>>> df = pd.DataFrame({'x': [1, 2, 3], 'y': ['a', 'a', 'b']})
>>> write_deltalake('path/to/table', df, partition_by=['y'])
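Partitioning on ``y`` pays off when reading the table back, since partition filters let readers skip files. A minimal sketch, assuming the partitioned table written above:

.. code-block:: python

    >>> from deltalake import DeltaTable
    >>> dt = DeltaTable('path/to/table')
    >>> # partition filters are (column, op, value) tuples
    >>> dt.to_pandas(partitions=[('y', '=', 'a')])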
