You can initiate the creation process by clicking the button on the Datasource tab.
Choose structured datasource from the type selection menu:
Select source of data
Select your data source from our range of prebuilt connectors, which cover all the major data warehouses.
Depending on your data source, your credentials will look slightly different:
Spreadsheets
Upload a file in either CSV or Excel (XLSX, XLS) format.
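If you want to sanity-check a spreadsheet before uploading it, a minimal stdlib sketch (the function name is illustrative, not part of LLMate) can confirm the CSV parses and every row has the same number of columns:

```python
import csv
from io import StringIO

def check_csv(text):
    """Parse CSV text and verify every row matches the header width.

    Returns (header, data_row_count). Raises ValueError on ragged rows.
    """
    rows = list(csv.reader(StringIO(text)))
    if not rows:
        raise ValueError("empty file")
    header = rows[0]
    for i, row in enumerate(rows[1:], start=2):
        if len(row) != len(header):
            raise ValueError(f"row {i} has {len(row)} columns, expected {len(header)}")
    return header, len(rows) - 1
```

For example, `check_csv("a,b\n1,2\n3,4")` returns `(["a", "b"], 2)`, while a ragged file raises a `ValueError` before you waste an upload.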
BigQuery
JSON Credentials: The service-account key file generated when you create a service account for BigQuery in Google Cloud.
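A common upload error is pasting the wrong JSON file. A quick sketch (hypothetical helper, not part of LLMate) that checks the file has the fields a Google service-account key always carries:

```python
import json

# Fields present in every Google Cloud service-account key file.
REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email"}

def validate_service_account_json(raw):
    """Parse a service-account key file and verify the expected fields.

    Returns the project_id on success; raises ValueError otherwise.
    """
    info = json.loads(raw)
    missing = REQUIRED_KEYS - info.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    if info["type"] != "service_account":
        raise ValueError("not a service-account key file")
    return info["project_id"]
```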
PostgreSQL
User: The PostgreSQL username.
Password: The PostgreSQL database password. (Example: supersecure!)
Host: The hostname or IP address where your database is running. (Example: rds.amazonaws.com, 43.205.136.128)
Port: The port your database uses. (Example: 5432)
Database Name: The name of the database you are connecting to. (Example: postgres)
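The five fields above are the same pieces a standard PostgreSQL connection URL is built from; a stdlib sketch (the function name is illustrative) shows how they fit together, with the password percent-encoded so characters like `!` survive:

```python
from urllib.parse import quote

def postgres_url(user, password, host, port, database):
    """Assemble a postgresql:// connection URL from the credential fields."""
    return (f"postgresql://{quote(user)}:{quote(password, safe='')}"
            f"@{host}:{port}/{database}")
```

For example, `postgres_url("admin", "supersecure!", "rds.amazonaws.com", 5432, "postgres")` yields `postgresql://admin:supersecure%21@rds.amazonaws.com:5432/postgres`. The same pattern applies to the MySQL and MariaDB fields below, with a `mysql://` scheme and their default port 3306.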
MariaDB
User: The MariaDB username.
Password: The MariaDB database password. (Example: supersecure!)
Host: The hostname or IP address where your database is running. (Example: rds.amazonaws.com, 43.205.136.128)
Port: The port your database uses. (Example: 3306)
Snowflake
User: The Snowflake username.
Password: The Snowflake data warehouse password. (Example: supersecure!)
Account: Your Snowflake account identifier. (Example: xy12345.us-west-2)
Warehouse: Your Snowflake warehouse name. (Example: shopifydata)
Database Collection: The name of the Snowflake database you want to connect to. (Example: sales)
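These five fields map directly onto the keyword arguments that the snowflake-connector-python package's `connect()` accepts. A minimal sketch (hypothetical helper; no connector required to run) that collects and validates them:

```python
def snowflake_params(user, password, account, warehouse, database):
    """Collect the Snowflake fields into connect()-style keyword arguments.

    Raises ValueError if any field is empty.
    """
    params = {
        "user": user,
        "password": password,
        "account": account,
        "warehouse": warehouse,
        "database": database,
    }
    missing = [k for k, v in params.items() if not v]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return params
```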
MySQL
User: The MySQL username.
Password: The MySQL database password. (Example: supersecure!)
Host: The hostname or IP address where your database is running. (Example: rds.amazonaws.com, 43.205.136.128)
Port: The port your database uses. (Example: 3306)
AWS Athena
AWS Access Key: The access key for your AWS account, used to authenticate API requests. (Example: AKIAIOSFODNN7EXAMPLE)
AWS Secret Key: The secret key associated with your AWS access key, providing secure access to your account. (Example: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY)
Region: The AWS region where your Athena instance and related services are hosted. (Example: us-west-2)
S3 Output Location: The S3 bucket and path where Amazon Athena query results will be stored. (Example: s3://my-athena-results-bucket/path/to/output/)
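The S3 output location must be a valid `s3://bucket/prefix` URL; a stdlib sketch (illustrative helper) that splits and validates it before you paste it into the form:

```python
from urllib.parse import urlparse

def split_s3_output_location(location):
    """Split an Athena S3 output location into (bucket, prefix).

    Accepts values like 's3://my-athena-results-bucket/path/to/output/'.
    """
    parsed = urlparse(location)
    if parsed.scheme != "s3" or not parsed.netloc:
        raise ValueError("expected an s3://bucket/prefix URL")
    return parsed.netloc, parsed.path.lstrip("/")
```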
Databricks
Server Hostname: The hostname of your Databricks server where your cluster or workspace is running. (Example: databricks-instance.cloud.databricks.com)
HTTP Path: The HTTP path to the Databricks cluster or SQL endpoint you are connecting to. (Example: /sql/1.0/endpoints/123456789abcdef)
Access Token: The personal access token for authenticating API requests and connections to Databricks. (Example: dapiexample01234abcd56789efghijklmnop)
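These three fields match the keyword arguments that the databricks-sql-connector package's `sql.connect()` expects. A sketch (hypothetical helper) that normalizes common copy-paste issues, such as pasting the full `https://` URL as the hostname or omitting the leading slash from the HTTP path:

```python
def databricks_params(server_hostname, http_path, access_token):
    """Normalize the Databricks fields into connect()-style keyword arguments."""
    # Strip the scheme and trailing slash if a full URL was pasted.
    hostname = server_hostname.removeprefix("https://").rstrip("/")
    if not http_path.startswith("/"):
        http_path = "/" + http_path
    return {
        "server_hostname": hostname,
        "http_path": http_path,
        "access_token": access_token,
    }
```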
AWS Redshift
User: The Redshift username used to authenticate and access the database. (Example: admin)
Password: The password for your Redshift database user. (Example: supersecurepassword!)
Host: The hostname or endpoint where your Redshift cluster is running. (Example: redshift-cluster.example.us-west-2.redshift.amazonaws.com)
Port: The port number used by the Redshift database. (Example: 5439)
DB Schema: The specific schema within the Redshift database that you want to query. (Example: public)
Database: The name of the Redshift database to connect to. (Example: mydatabase)
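Redshift speaks the PostgreSQL wire protocol, so the fields above also assemble into a connection URL; the extra DB Schema field maps to the libpq `search_path` option. A sketch, assuming a libpq-compatible driver (the `redshift://` scheme is the convention used by sqlalchemy-redshift):

```python
from urllib.parse import quote

def redshift_url(user, password, host, port, database, schema):
    """Assemble a Redshift connection URL, applying the schema via search_path."""
    return (f"redshift://{quote(user)}:{quote(password, safe='')}@{host}:{port}/"
            f"{database}?options=-csearch_path%3D{schema}")
```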
If you need assistance connecting to LLMate, reach out to us: support@llmate.ai
Your datasource is now ready for querying!
You can also upload a CSV file containing tabular data. For larger datasets, we recommend storing the data in a warehouse and connecting it to LLMate.ai with read-only access. Select your datasource and click the button.