You'd need to have a profile for each user, so you can track if they reach
the URL (true/false) and then use their email or internal ID or a generated
UUID for the certificate (you could store this extra code as well if
needed).
For generating a certificate, I'd use a PDF format; this would cre
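A minimal sketch of what Derek describes (untested; the OneToOne profile layout and field names are just illustrative):

    import uuid

    from django.conf import settings
    from django.db import models


    class Profile(models.Model):
        # Tracks whether the user reached the URL and holds a stable
        # per-user code to stamp on the certificate.
        user = models.OneToOneField(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
        reached_url = models.BooleanField(default=False)
        certificate_uuid = models.UUIDField(default=uuid.uuid4, editable=False, unique=True)

For the PDF itself, a library such as reportlab or WeasyPrint can render the certificate with that UUID embedded.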
Hi there, for those of you looking for resources to learn Django, you can
check these Telegram channels for learning resources.
https://t.me/djangonautees
https://t.me/joinchat/AFg8fQgHoBk5uPMCvQ
I have a custom model field in which I have implemented cryptography. I
get perfect results when I don't use lookups for filtering data, i.e. it
calls *get_prep_value()*. But when I filter data using lookups like
*startswith* or *endswith*, it does not call *get_prep_value()*. How to
solve this problem?
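If I understand the internals right, that's expected: *startswith* and *endswith* are PatternLookup subclasses, which set prepare_rhs = False, so the field's get_prep_value() is deliberately skipped. You can restore the call by registering a custom lookup, but note that with a real cipher a SQL LIKE on ciphertext won't match anything useful unless the encryption is deterministic and prefix-preserving. A sketch of the mechanics (EncryptedTextField and encrypt() are stand-ins for your field and cipher):

    from django.db import models
    from django.db.models.lookups import StartsWith


    def encrypt(value):
        ...  # assumed helper: must match whatever cipher the field uses on save


    class EncryptedTextField(models.TextField):
        def get_prep_value(self, value):
            # Called on save and for exact lookups, but not for pattern lookups.
            return None if value is None else encrypt(value)


    class EncryptedStartsWith(StartsWith):
        def get_prep_lookup(self):
            # PatternLookup skips the field's prep logic; call it explicitly
            # so the query value is encrypted too.
            return self.lhs.output_field.get_prep_value(self.rhs)


    EncryptedTextField.register_lookup(EncryptedStartsWith)

In practice, if you need substring search over encrypted data, you usually end up decrypting and filtering in Python, or storing a separate searchable digest alongside the ciphertext.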
Hello,
I'm working on a data- and calculation-intensive Django application. It
uses pandas and plotly.py to create many charts and graphs. I have some
control over the design of the application but not total control. One
decision that I'm stuck with is that all charts and graphs are preprocessed
Could you run a cron job on the system to analyze the data periodically?
Sent from my iPhone
> On Oct 22, 2020, at 1:30 PM, Lois Greene-Hernandez wrote:
>
> Hello,
>
> I'm working on a data and calculation intensive django application. It uses
> pandas and plotly.py to create many charts and graphs.
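If cron is an option, the usual Django shape for it is a custom management command (path and names illustrative, untested):

    # charts/management/commands/preprocess_charts.py
    from django.core.management.base import BaseCommand


    class Command(BaseCommand):
        help = "Recompute the heavy pandas/plotly data on a schedule."

        def handle(self, *args, **options):
            # ...run the pandas aggregations and persist the results...
            self.stdout.write(self.style.SUCCESS("Chart data refreshed"))

Then a crontab entry such as */15 * * * * /path/to/venv/bin/python /path/to/manage.py preprocess_charts keeps the data as fresh as you need.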
Well, it's data that I need to populate the pages. It's very
processor-intensive, and I'll need to pass it back to the views once it's
processed. Can I pass data back to the views once I've run a task? I'm
looking into caching alternatives.
Thanks
Lois
On Thu, Oct 22, 2020 at 4:55 PM Scott Sawyer wrote:
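Yes - the cache framework is one straightforward hand-off: the periodic task writes the processed result, and the view only reads. A sketch (the key name, payload, and compute_heavy_pandas_stuff() are illustrative):

    from django.core.cache import cache
    from django.http import JsonResponse


    def refresh_chart_data():
        # Run from the cron job / background task, not per request.
        result = compute_heavy_pandas_stuff()  # assumed expensive function
        cache.set("chart_data", result, timeout=None)  # keep until next refresh


    def chart_view(request):
        # The view just reads the precomputed data.
        return JsonResponse(cache.get("chart_data", default={}))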
Hi all,
If a record is updated more than once in a transaction, causing something to
need to happen on commit, but you don't want to suffer the performance cost
of unnecessarily doing that post-commit activity multiple times, is there
any clean way to de-duplicate it?
If on_commit has an optional
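As far as I know, on_commit() has no such option, so de-duplication has to live on your side. One sketch: keep a per-thread set of keys and only register the hook the first time (illustrative; note that after a rollback the hooks are discarded but the keys linger, so a real version should also clear the registry when the transaction ends):

    import threading

    from django.db import transaction

    _pending = threading.local()


    def on_commit_once(key, func, using=None):
        # Register func at most once per key; later calls in the same
        # transaction are no-ops.
        keys = getattr(_pending, "keys", None)
        if keys is None:
            keys = _pending.keys = set()
        if key in keys:
            return
        keys.add(key)

        def run():
            keys.discard(key)
            func()

        transaction.on_commit(run, using=using)

Called as on_commit_once(("invoice", invoice.pk), notify_billing), repeated updates to the same invoice schedule the notification only once.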
Hi Lois,
To expand on what Scott is saying.
- model DB tables to store the data you need to render the graphs
- build a service to preprocess the data - this service can run
periodically depending on how up to date you need the graphs to be (a
microservice)
- build a single endpoint to simply fetch the preprocessed data
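A bare-bones version of that last endpoint, assuming an illustrative ChartData model that the preprocessing service populates:

    from django.http import JsonResponse

    from .models import ChartData  # written by the preprocessor, not the view


    def chart_data(request, slug):
        # No pandas work per request - just fetch what was precomputed.
        rows = ChartData.objects.filter(chart=slug).values("x", "y")
        return JsonResponse({"points": list(rows)})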
dumpdata and loaddata work for me on some smaller applications, but I am
wondering whether any of you have strategies to scale it up to more data.
My DevOps team doesn't really give developers access to much - not even RDS
snapshots. The larger applications have written some logic to move data,
bu
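For what it's worth, a few dumpdata flags help at larger sizes - natural keys avoid primary-key collisions on load, and excluding contenttypes and auth.permission sidesteps the classic loaddata conflicts. Via call_command so it can run from a script (compressed -o output needs Django 3.2+, if I remember right):

    from django.core.management import call_command

    call_command(
        "dumpdata",
        natural_foreign=True,
        natural_primary=True,
        exclude=["contenttypes", "auth.permission"],
        output="data.json.gz",  # the extension selects gzip compression
    )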
I have found an answer of sorts in a previous thread. Fred says:
> Just for the follow-up, I ended up using pgloader and after some argument
tweaking, it worked. Cheers.
On Thursday, October 22, 2020 at 6:48:57 PM UTC-4 Dan Davis wrote:
> dumpdata and loaddata work for me on some smaller applic
Well, it's a bit of a blunt weapon, but Python is going to run any code it
finds at module scope as the process is initialised. So you could cause
your heavy stuff to be run there? Or at least spawned from there?
On Thu, 22 Oct 2020, 22:59 Okware Aldo, wrote:
> Hi Lois,
>
> To expand on what Scott is saying.
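If you go the run-it-at-process-start route, the usual hook for that kind of initialisation is AppConfig.ready(), with the heavy part pushed to a daemon thread so startup isn't blocked. Illustrative sketch - note that ready() can fire more than once (e.g. under autoreload), hence the guard:

    import threading

    from django.apps import AppConfig


    class ChartsConfig(AppConfig):
        name = "charts"  # illustrative app label
        _started = False

        def ready(self):
            if ChartsConfig._started:
                return
            ChartsConfig._started = True
            threading.Thread(target=self._preprocess, daemon=True).start()

        @staticmethod
        def _preprocess():
            ...  # the heavy pandas/plotly work goes here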
This is perfect. Thank you Derek!
On Thu, Oct 22, 2020 at 9:14 AM Derek wrote:
> You'd need to have a profile for each user, so you can track if they reach
> the URL (true/false) and then use their email or internal ID or a generated
> UUID for the certificate (you could store this extra code as well if needed).