
docs: remove snowflake, add row supported DBs (#587)
nehanene15 committed Sep 14, 2022
1 parent da4faaf, commit 1d923f5
Showing 3 changed files with 22 additions and 24 deletions.
README.md: 14 changes (7 additions & 7 deletions)
@@ -14,7 +14,7 @@ perform this task.

DVT supports the following validations:
* Column validation (count, sum, avg, min, max, group by)
-* Row validation (BQ, Hive, and Teradata only)
+* Row validation (BQ, Hive, Teradata, Oracle, SQL Server only)
* Schema validation
* Custom Query validation
* Ad hoc SQL exploration
@@ -31,7 +31,6 @@ DVT supports the following connection types:
* [Oracle](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/docs/connections.md#oracle)
* [Postgres](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/docs/connections.md#postgres)
* [Redshift](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/docs/connections.md#redshift)
-* [Snowflake](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/docs/connections.md#snowflake)
* [Spanner](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/docs/connections.md#google-spanner)
* [Teradata](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/docs/connections.md#teradata)

@@ -134,9 +133,10 @@ The [Examples](https://github.com/GoogleCloudPlatform/professional-services-data

#### Row Validations

-(Note: Row hash validation is currently only supported for BigQuery, Teradata, and Imapala/Hive. Struct and array
-data types are not currently supported. In addition, please note that SHA256 is not a supported function on teradata
-systems. If you wish to perform this comparison on teradata you will need to
+(Note: Row hash validation is currently supported for BigQuery, Teradata, Impala/Hive, Oracle, and SQL Server.
+Struct and array data types are not currently supported and random row is not yet supported for Oracle or SQL Server.
+In addition, please note that SHA256 is not a supported function on Teradata systems.
+If you wish to perform this comparison on Teradata you will need to
[deploy a UDF to perform the conversion](https://github.com/akuroda/teradata-udf-sha2/blob/master/src/sha256.c).)
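
For context, row validations like the ones this note describes are run with DVT's `data-validation validate row` command together with the `--hash` flag. The sketch below is illustrative only and is not part of this commit; the connection names, table, and primary key column are placeholders.

```
# Illustrative sketch: my_source_conn, my_target_conn, and the table/key
# names are placeholders. '--hash *' hashes all columns; rows are matched
# on the column(s) passed to --primary-keys.
data-validation validate row \
  -sc my_source_conn \
  -tc my_target_conn \
  -tbls my_project.my_dataset.my_table \
  --primary-keys id \
  --hash '*'
```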

Below is the command syntax for row validations. In order to run row level
@@ -272,7 +272,7 @@ page provides few examples of how this tool can be used to run custom query vali
#### Custom Query Row Validations

(Note: Row hash validation is currently only supported for BigQuery, Teradata, and
-Imapala/Hive. Struct and array data types are not currently supported.)
+Impala/Hive. Struct and array data types are not currently supported.)

Below is the command syntax for row validations. In order to run row level
validations you need to pass `--hash` flag which will specify the fields
@@ -535,4 +535,4 @@ cast to NUMERIC.

## Contributing

-Contributions are welcome. See the [contributing guide](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/CONTRIBUTING.md) for details.
+Contributions are welcome. See the [Contributing guide](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/CONTRIBUTING.md) for details.
docs/connections.md: 16 changes (0 additions & 16 deletions)
@@ -46,7 +46,6 @@ The data validation tool supports the following connection types.
* [Teradata](#teradata)
* [Oracle](#oracle)
* [MSSQL](#mssql-server)
-* [Snowflake](#snowflake)
* [Postgres](#postgres)
* [MySQL](#mysql)
* [Redshift](#redshift)
@@ -172,21 +171,6 @@ Then `pip install pyodbc`.
}
```

-## Snowflake
-```
-{
-# Configuration Required for All Data Sources
-"source_type": "Snowflake",
-# Connection Details
-"user": "my-user",
-"password": "my-password",
-"account": "Snowflake account to connect to"
-"database":"my-db"
-"schema": "my-schema"
-}
-```

## Postgres
```
{
third_party/ibis/ibis_snowflake/README.md: 16 changes (15 additions & 1 deletion)
@@ -32,7 +32,21 @@ The Snowflake client is accessible through the ibis.ibis_snowflake namespace. Th
result=tb_name.count().execute()
print(result)**

-#
+# Snowflake DVT connection
+```
+{
+# Configuration Required for All Data Sources
+"source_type": "Snowflake",
+# Connection Details
+"user": "my-user",
+"password": "my-password",
+"account": "Snowflake account to connect to"
+"database":"my-db"
+"schema": "my-schema"
+}
+```
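
As a usage note (not part of this diff), a connection defined like the JSON above is normally registered through the `data-validation connections add` command. The flag names below simply mirror the JSON keys and are assumptions rather than confirmed syntax for the Snowflake connector; check `data-validation connections add --help` for the exact options.

```
# Assumed sketch: flag names mirror the JSON keys above and should be
# verified against the CLI help before use; all values are placeholders.
data-validation connections add \
  --connection-name MY_SNOWFLAKE_CONN Snowflake \
  --user my-user \
  --password my-password \
  --account my-account \
  --database my-db \
  --schema my-schema
```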

# **3.Usage**

- Schema for the 'students_pointer' table:-
