Hi, spark-sql estimated the input for a Cassandra table with 3 rows as 8 TB.
Sometimes it's estimated as -167 B instead.
I run it on a laptop; I don't have 8 TB of space for the data.

We use DSE 4.7 with the bundled Spark and the Spark SQL Thrift Server.

Here are the stats for a dummy `select foo from bar`, where bar has three rows
and several columns:


   - *Total task time across all tasks:* 7.6 min
   - *Input:* 8388608.0 TB

I don't have that many TB on my MacBook Pro. I'd like to, but I don't :(
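For context on where a number like that can come from: as far as I understand (this is a hedged sketch, not the connector's actual code), the Spark Cassandra connector derives its input-size estimate from Cassandra's `system.size_estimates` table, roughly summing `mean_partition_size * partitions_count` over token ranges. The rows below are invented purely to show the arithmetic; stale or bogus `size_estimates` entries would inflate the total the same way:

```python
# Invented sample rows standing in for system.size_estimates entries
# (one dict per token range; values are hypothetical).
rows = [
    {"mean_partition_size": 128, "partitions_count": 3},
]

# Rough estimate: total bytes = sum over ranges of mean size * count.
total_bytes = sum(r["mean_partition_size"] * r["partitions_count"] for r in rows)
print(total_bytes)  # → 384
```

So three small partitions should come out to a few hundred bytes, nowhere near 8388608.0 TB; if anyone knows how the estimate goes that far off (or negative), I'd appreciate a pointer.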
