Downloading data from chunks is not implemented yet. #13
Comments
Hello! Yes, Snowflake returns "big" response data in "chunks" (pieces), so the client has to download and combine them locally to get the full result. Unfortunately this feature is not implemented yet.
Hi Ilya, thanks for your response. How would you read large data sets with the library? E.g. if I execute "snowflakeClient.QueryRawResponseAsync("select top 500 * from software;")", I see that there are 2 chunks and 0 rows in the result set. Chunk 1 has 361 rows and chunk 2 has the rest, i.e. 139 rows. I can't find any way to read those chunks though; how can it be done?
Hi! So each chunk has a URL property (
Any chance that downloading chunks will be implemented soon? The Snowflake.Data API is choking on large result sets, and my quick code example of downloading 25 chunks worked super fast. I like what I see from the rest of this API, but we definitely need to be able to download result sets with more than 1,000 records.
@andras-nemes-snowsoftware, @rteising It does this automatically, but if you want, you can try to use:

```csharp
var queryResult = await _snowflakeClient.QueryRawResponseAsync("select top 10000 * from SNOWFLAKE_SAMPLE_DATA.TPCH_SF1000.SUPPLIER;");
var chunksDownloadInfo = new ChunksDownloadInfo() { ChunkHeaders = queryResult.ChunkHeaders, Chunks = queryResult.Chunks, Qrmk = queryResult.Qrmk };
var parsedChunks = await ChunksDownloader.DownloadAndParseChunksAsync(chunksDownloadInfo);
var suppliers = SnowflakeDataMapper.MapTo<Supplier>(queryResult.Columns, parsedChunks);
```

Unfortunately I can't test this feature extensively, because I'm using a free-tier account in SF, so (as always) I would appreciate any feedback about it.
Thanks for your work Ilya, it's much appreciated! |
@fixer-m is there a way to download and process one chunk at a time? The above example looks like it downloads all chunks and then assigns it to the
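For reference, per-chunk processing might be sketched as below, using the type and method names from the example above. This is only a guess at how the API could be driven one chunk at a time; whether `ChunksDownloadInfo` accepts a single-chunk list, the element type of `Chunks`, and the `ProcessBatch` helper are all assumptions, not verified against the library:

```csharp
// Hypothetical sketch: process one chunk at a time instead of all at once.
// Assumes ChunksDownloadInfo can be built around a single chunk; the chunk
// element type and ProcessBatch are placeholders, not confirmed library API.
var queryResult = await _snowflakeClient.QueryRawResponseAsync(
    "select top 10000 * from SNOWFLAKE_SAMPLE_DATA.TPCH_SF1000.SUPPLIER;");

foreach (var chunk in queryResult.Chunks)
{
    var singleChunkInfo = new ChunksDownloadInfo()
    {
        ChunkHeaders = queryResult.ChunkHeaders,
        Chunks = new List<ResponseChunk> { chunk }, // one chunk per download call (assumed element type)
        Qrmk = queryResult.Qrmk
    };

    var parsedRows = await ChunksDownloader.DownloadAndParseChunksAsync(singleChunkInfo);
    var suppliers = SnowflakeDataMapper.MapTo<Supplier>(queryResult.Columns, parsedRows);

    // Handle this batch before downloading the next chunk, so only
    // one chunk's rows are held in memory at a time.
    ProcessBatch(suppliers);
}
```

If the library's `Chunks` collection type differs, the idea is the same: wrap each chunk in its own download request and map the parsed rows batch by batch.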
Hello,
I get the exception in the title when I run QueryAsync with a "large enough" result. E.g. if I have 1000 elements in the table and execute a SELECT * FROM on it, then snowflakeClient.QueryAsync throws this exception. There seems to be no fixed limit on the number of rows in the data set; it's rather the total size of the returned data.
Is this a known issue?
Thanks,
Andras