Which functions, for example, are available (and which are not) for
transformation into GPU source code?
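
As a rough way to check this in practice (the table below is made up purely
for illustration), one can run EXPLAIN and see whether the qual ends up under
a GPU custom-scan node or falls back to a plain Seq Scan; the exact node name
depends on the installed PG-Strom version:

    CREATE TABLE t (id int, x float8);
    INSERT INTO t SELECT g, random() FROM generate_series(1, 100000) g;
    ANALYZE t;
    -- If sqrt() is device-executable, a GPU scan node should appear here;
    -- otherwise the planner keeps the ordinary Seq Scan.
    EXPLAIN SELECT count(*) FROM t WHERE sqrt(x) > 0.9;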

What factor do you multiply against the actual CPU cost? For example, the
default cpu_tuple_cost is 0.01.
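
For context, my understanding of the stock PostgreSQL side is the documented
formula: a sequential scan's run cost is (disk pages * seq_page_cost) +
(rows * cpu_tuple_cost). What I am asking is which additional factor PG-Strom
applies on top of this baseline:

    -- Stock planner parameters that feed the seq scan estimate:
    SHOW seq_page_cost;   -- default 1.0
    SHOW cpu_tuple_cost;  -- default 0.01
    -- run_cost = relpages * seq_page_cost + reltuples * cpu_tuple_cost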

Consider, for example, a seq scan with cost=0.00..458.00: how is that
multiplied to get the cost for the GPU, assuming a single GPU card?
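
To make the arithmetic concrete: cost=0.00..458.00 would arise, for instance,
from a table of 208 pages and 25,000 rows (208 * 1.0 + 25000 * 0.01 = 458.00).
My guess at the GPU analogue is sketched below, but the pg_strom.* parameter
name is an assumption on my part, not something I have verified:

    -- The page/row counts behind an estimate can be read from pg_class:
    SELECT relpages, reltuples FROM pg_class WHERE relname = 't';
    -- Guessed shape of the GPU costing (parameter name assumed):
    --   gpu_cost ~ gpu_setup + relpages * seq_page_cost
    --            + reltuples * pg_strom.gpu_tuple_cost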

Is there any documentation covering these GPU costing details?





