Hi there - know this might be a silly question, but asking anyway... As
in, know these formats/data-types are probably not really possible to
compress any more than they already are.
Have managed to sort out capturing screenshots repeatedly, while
recording audio in the background, using
On 3/10/24 11:48 am, Left Right wrote:
So, streaming parsers (eg. SAX) are written for a regular language
that approximates XML.
SAX doesn't parse a whole XML document, it parses small pieces of it
independently and passes them on. It's more like a lexical analyser than
a parser in that respect.
This thread is derailing.
Please consider it closed.
--
~Ethan~
Moderator
> You can't validate an IP packet without having all of it. Your notion
> of "streaming" is nonsensical.
Whoa, whoa, hold your horses! "nonsensical" needs a little bit of
justification :)
It seems you don't understand the difference between words and
languages! In my examples, IP _protocol_ is the language, and individual packets are its words.
> One single IP packet is all you can parse.
I worked for an undisclosed company which manufactures h/w for ISPs
(4- and 8-unit boxes you mount on a rack in a datacenter).
Essentially, big-big routers. So, I had the pleasure of writing
software that parses IP _protocol_, and let me tell you: you
On Thu, 3 Oct 2024 at 08:48, Left Right wrote:
>
> > You can't validate an IP packet without having all of it. Your notion
> > of "streaming" is nonsensical.
>
> Whoa, whoa, hold your horses! "nonsensical" needs a little bit of
> justification :)
>
> It seems you don't understand the difference between words and languages!
On Wed, 2 Oct 2024 at 23:53, Left Right via Python-list
wrote:
> In the same email you replied to, I gave examples of languages for
> which parsers can be streaming (in general): SCSI or IP.
You can't validate an IP packet without having all of it. Your notion
of "streaming" is nonsensical.
ChrisA
> By that definition of "streaming", no parser can ever be streaming,
> because there will be some constructs that must be read in their
> entirety before a suitably-structured piece of output can be
> emitted.
In the same email you replied to, I gave examples of languages for
which parsers can be streaming (in general): SCSI or IP.
On 2/10/24 12:26 pm, avi.e.gr...@gmail.com wrote:
The real problem is how the JSON is set up. If you take umpteen data
structures and wrap them all in something like a list, then it may be a tad
hard to stream as you may not necessarily be examining the contents till the
list finishes, gigabytes later.
, from transmitting and receiving single
bits over a wire all the way up to what are now known as session and
presentation layers. Some imposed maximum lengths in certain places;
some allowed for indefinite amounts of data to be transferred from one
end to the other without stopping, resetting, o
next element from
142857 each time from a circular loop.
Sines, cosines, pi, e and so on can often be calculated to arbitrary
precision by evaluating things like infinite Taylor series for as many terms
as needed to reach the precision of the type holding the number.
Similar ideas allow
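For instance, a minimal sketch of the Taylor-series idea using the
standard-library decimal module (the digit count and names are illustrative):

    from decimal import Decimal, getcontext

    def e_taylor(digits):
        # Sum e = 1/0! + 1/1! + 1/2! + ... until the terms stop mattering.
        getcontext().prec = digits + 5          # a few guard digits
        eps = Decimal(10) ** -(digits + 2)
        term = total = Decimal(1)
        n = 0
        while term > eps:
            n += 1
            term /= n                           # next 1/n! from the previous term
            total += term
        return +total                           # unary plus rounds to context precision

    print(e_taylor(50))   # 2.71828182845904523536...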
to destage an arbitrarily large file (or a chunk of a file) to disk.
But SCSI is built of finite "words", and to describe an arbitrarily large
file you'd need to list all the blocks that constitute the file!
I don't follow. What fsync() does is ensure that any data buffered
in the kernel related to that file is flushed to the storage device.
On 1/10/24 8:34 am, Left Right wrote:
You probably forgot that it has to be _streaming_. Suppose you parse
the first digit: can you hand this information over to an external
function to process the parsed data? -- No! because you don't know the
magnitude yet.
By that definition of "streaming", no parser can ever be streaming,
because there will be some constructs that must be read in their
entirety before a suitably-structured piece of output can be emitted.
to sync _everything_ (and it hurts!)
On Tue, Oct 1, 2024 at 5:49 PM Dan Sommers via Python-list
wrote:
>
> On 2024-09-30 at 21:34:07 +0200,
> Regarding "Re: Help with Streaming and Chunk Processing for Large JSON Data
> (60 GB) from Kenna API,"
> Left Right via Python-list wrote:
On 2024-09-30 at 21:34:07 +0200,
Regarding "Re: Help with Streaming and Chunk Processing for Large JSON Data (60
GB) from Kenna API,"
Left Right via Python-list wrote:
> > What am I missing? Handwavingly, start with the first digit, and as
> > long as the next character is a digit, multiply by ten and add the next digit?
On 2024-09-30 at 18:48:02 -0700,
Keith Thompson via Python-list wrote:
> 2qdxy4rzwzuui...@potatochowder.com writes:
> [...]
> > In Common Lisp, you can write integers as #nnR[digits], where nn is the
> > decimal representation of the base (possibly without a leading zero),
> > the # and the R are literal characters, and the digits are written in the intended base.
with the least
> significant digit?
You probably forgot that it has to be _streaming_. Suppose you parse
the first digit: can you hand this information over to an external
function to process the parsed data? -- No! because you don't know the
magnitude yet. What about two digits? -- Same thing.
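For concreteness, a small sketch of what most-significant-first "streaming"
looks like: the running value is exact so far, but nothing is known about
the final magnitude until the digits stop, which is exactly the point of
contention here.

    def parse_int_stream(chars):
        value = 0
        for c in chars:
            if not c.isdigit():
                break
            value = value * 10 + int(c)   # fold in the next digit
            yield value                   # hand a partial result downstream

    print(list(parse_int_stream("142857")))   # [1, 14, 142, 1428, 14285, 142857]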
2qdxy4rzwzuui...@potatochowder.com writes:
[...]
> In Common Lisp, you can write integers as #nnR[digits], where nn is the
> decimal representation of the base (possibly without a leading zero),
> the # and the R are literal characters, and the digits are written in
> the intended base. So the inp
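Python's int() accepts the same range of bases, which makes the parity
point easy to demonstrate (a small illustrative snippet):

    # The same digit string can be even in one base and odd in another.
    print(int("11", 3) % 2)    # "11" in base 3 is 4 -> even
    print(int("11", 10) % 2)   # "11" in base 10 is 11 -> odd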
On 2024-10-01 at 09:09:07 +1000,
Chris Angelico via Python-list wrote:
> On Tue, 1 Oct 2024 at 08:56, Grant Edwards via Python-list
> wrote:
> >
> > On 2024-09-30, Dan Sommers via Python-list wrote:
> >
> > > In Common Lisp, integers can be written in any integer base from two
> > > to thirty six, inclusive.
On Tue, 1 Oct 2024 at 08:56, Grant Edwards via Python-list
wrote:
>
> On 2024-09-30, Dan Sommers via Python-list wrote:
>
> > In Common Lisp, integers can be written in any integer base from two
> > to thirty six, inclusive. So knowing the last digit doesn't tell
> > you whether an integer is even or odd until you know the base anyway.
On 2024-09-30, Dan Sommers via Python-list wrote:
> In Common Lisp, integers can be written in any integer base from two
> to thirty six, inclusive. So knowing the last digit doesn't tell
> you whether an integer is even or odd until you know the base
> anyway.
I had to think about that for an
On 2024-10-01 at 04:46:35 +1000,
Chris Angelico via Python-list wrote:
> On Tue, 1 Oct 2024 at 04:30, Dan Sommers via Python-list
> wrote:
> >
> > But why do I need to start with the least
> > significant digit?
>
> If you start from the most significant, you don't know anything about
> the number until you finish parsing it.
On 9/30/2024 11:30 AM, Barry via Python-list wrote:
On 30 Sep 2024, at 06:52, Abdur-Rahmaan Janhangeer via Python-list
wrote:
import polars as pl
pl.read_json("file.json")
This is not going to work unless the computer has a lot more than 60GiB of RAM.
As later suggested a streaming parser is required.
> Streaming won't work because the file is gzipped. You have to receive
> the whole thing before you can unzip it. Once unzipped it will be even
> larger, and all in memory.
GZip is specifically designed to be streamed. So, that's not a
problem (in principle), but you would need to have a streaming decompressor as well.
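For what it's worth, a minimal sketch of that combination, assuming the
third-party ijson package and an illustrative file name: gzip.open
decompresses lazily, and ijson yields the elements of a top-level JSON array
one at a time, so the 60 GB never has to fit in memory.

    import gzip
    import ijson   # third-party: pip install ijson

    def process(record):
        # Placeholder: handle one decoded JSON object at a time.
        print(record.get("id"))

    with gzip.open("vulns.json.gz", "rb") as f:
        # "item" selects each element of a top-level JSON array.
        for record in ijson.items(f, "item"):
            process(record)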
On 9/30/2024 1:00 PM, Chris Angelico via Python-list wrote:
On Tue, 1 Oct 2024 at 02:20, Thomas Passin via Python-list
wrote:
On 9/30/2024 11:30 AM, Barry via Python-list wrote:
On 30 Sep 2024, at 06:52, Abdur-Rahmaan Janhangeer via Python-list
wrote:
import polars as pl
pl.read_json("
On Tue, 1 Oct 2024 at 04:30, Dan Sommers via Python-list
wrote:
>
> But why do I need to start with the least
> significant digit?
If you start from the most significant, you don't know anything about
the number until you finish parsing it. There's almost nothing you can
say about a number given only its leading digits.
On 2024-09-30, Dan Sommers via Python-list wrote:
> On 2024-09-30 at 11:44:50 -0400,
> Grant Edwards via Python-list wrote:
>
>> On 2024-09-30, Left Right via Python-list wrote:
>> > [...]
>> > Imagine a pathological case of this shape: 1... <60GB of digits>. This
>> > is still a valid JSON (it
On 2024-09-30 at 11:44:50 -0400,
Grant Edwards via Python-list wrote:
> On 2024-09-30, Left Right via Python-list wrote:
> > Whether and to what degree you can stream JSON depends on JSON
> > structure. In general, however, JSON cannot be streamed (but commonly
> > it can be).
> >
> > Imagine a pathological case of this shape: 1... <60GB of digits>.
On Tue, 1 Oct 2024 at 02:20, Thomas Passin via Python-list
wrote:
>
> On 9/30/2024 11:30 AM, Barry via Python-list wrote:
> >
> >
> >> On 30 Sep 2024, at 06:52, Abdur-Rahmaan Janhangeer via Python-list
> >> wrote:
> >>
> >>
> >> import polars as pl
> >> pl.read_json("file.json")
> >>
> >>
> >
>
On 9/30/2024 11:30 AM, Barry via Python-list wrote:
On 30 Sep 2024, at 06:52, Abdur-Rahmaan Janhangeer via Python-list
wrote:
import polars as pl
pl.read_json("file.json")
This is not going to work unless the computer has a lot more than 60GiB of RAM.
As later suggested a streaming parser is required.
On 2024-09-30, Left Right via Python-list wrote:
> Whether and to what degree you can stream JSON depends on JSON
> structure. In general, however, JSON cannot be streamed (but commonly
> it can be).
>
> Imagine a pathological case of this shape: 1... <60GB of digits>. This
> is still a valid JSON
> On 30 Sep 2024, at 06:52, Abdur-Rahmaan Janhangeer via Python-list
> wrote:
>
>
> import polars as pl
> pl.read_json("file.json")
>
>
This is not going to work unless the computer has a lot more than 60GiB of RAM.
As later suggested a streaming parser is required.
Barry
On Mon, Sep 30, 2024 at 8:44 AM Asif Ali Hirekumbi via Python-list
wrote:
>
> Thanks Abdur Rahmaan.
> I will give it a try !
>
> Thanks
> Asif
>
> On Mon, Sep 30, 2024 at 11:19 AM Abdur-Rahmaan Janhangeer <
> arj.pyt...@gmail.com> wrote:
>
> > Idk if you tried Polars, but it seems to work well with JSON data
Thanks Abdur Rahmaan.
I will give it a try !
Thanks
Asif
On Mon, Sep 30, 2024 at 11:19 AM Abdur-Rahmaan Janhangeer <
arj.pyt...@gmail.com> wrote:
> Idk if you tried Polars, but it seems to work well with JSON data
>
> import polars as pl
> pl.read_json("file.json")
Idk if you tried Polars, but it seems to work well with JSON data
import polars as pl
pl.read_json("file.json")
Kind Regards,
Abdur-Rahmaan Janhangeer
about <https://compileralchemy.github.io/> | blog
<https://www.pythonkitchen.com>
github <https://github.com/Abdur-Rah
Dear Python Experts,
I am working with the Kenna Application's API to retrieve vulnerability
data. The API endpoint provides a single, massive JSON file in gzip format,
approximately 60 GB in size. Handling such a large dataset in one go is
proving to be quite challenging, especially in terms of memory usage.
Stefan Ram wrote:
> Chris Green wrote or quoted:
> >That's exactly the sort of solution I was wondering about. Is there a
> >ready made module/library for handling this sort of thing? Basically
> >it will just be a string of a few tens of characters that would be
>kept up to date by one process
If needed, a simple file lock can then be used
> > to prevent simultaneous access (well, simultaneous access when the
> > writing process is writing).
>
> The thing with a file is, it persists even when the collector process is
> not running. Do you want data that persists when the process is not running?
Piergiorgio Sartor
wrote:
> On 06/07/2024 09.28, Chris Green wrote:
> > I have a Raspberry Pi in my boat that uses I2C to read a number of
> > voltages and currents (using ADS1115 A2D) so I can monitor the battery
> > condition etc.
> >
> > At present various different scripts (i.e. processes) just read the
> > values using the I2C bus whenever they need to
If resource usage isn't an issue, then the _easy_ thing to do, that
would also be easily correct is to have a server doing all the
h/w-related reading and clients talking to that server. Use for the
server the technology you feel most confident with. Eg. you may use
Python's http package. I believe
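A minimal sketch of that server idea using only the standard library
(read_values() is a hypothetical stand-in for the real ADS1115/I2C code):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def read_values():
        return {"battery_v": 12.6, "load_a": 3.2}   # placeholder readings

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(read_values()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Only this process touches the I2C bus; clients just do HTTP GETs.
        HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()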
> On 7 Jul 2024, at 23:47, MRAB via Python-list wrote:
>
> For clarity I'd recommend os.replace instead. This is because on Windows
> os.rename would complain if the target file already exists, but os.replace
> behaves the same on both Linux and Windows.
Agreed.
In this case the O
On 2024-07-07 23:27, Barry via Python-list wrote:
On 7 Jul 2024, at 22:13, Chris Green via Python-list
wrote:
a simple file lock can then
be used to prevent simultaneous access (well, simultaneous access when
the writing process is writing).
There is a simple pattern to make this robust.
> On 7 Jul 2024, at 22:13, Chris Green via Python-list
> wrote:
>
> a simple file lock can then
> be used to prevent simultaneous access (well, simultaneous access when
> the writing process is writing).
There is a simple pattern to make this robust.
Write new values to a tmp file.
Close the tmp file, then rename it over the real file; the swap is atomic,
so readers never see a partial file.
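A minimal sketch of that pattern (file names are illustrative); os.replace()
swaps the new file into place atomically on both Linux and Windows:

    import json, os

    def publish(values, path="readings.json"):
        tmp = path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(values, f)
            f.flush()
            os.fsync(f.fileno())   # make sure the bytes hit the disk first
        os.replace(tmp, path)      # atomic swap into place

    publish({"battery_v": 12.6})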
On 06/07/2024 12:32, Stefan Ram wrote:
But why overengineer? If you feel comfortable with the file
solution, go for it! The only drawback might be that it's a
bit slower than other approaches.
I absolutely agree. Overengineering is generally a bad idea because
you're using a complex solution where a simple one would do.
On 06/07/2024 09.28, Chris Green wrote:
I have a Raspberry Pi in my boat that uses I2C to read a number of
voltages and currents (using ADS1115 A2D) so I can monitor the battery
condition etc.
At present various different scripts (i.e. processes) just read the
values using the I2C bus whenever they need to
I have a Raspberry Pi in my boat that uses I2C to read a number of
voltages and currents (using ADS1115 A2D) so I can monitor the battery
condition etc.
At present various different scripts (i.e. processes) just read the
values using the I2C bus whenever they need to but I'm pretty sure
this (quit
Virtual meeting, Wed 17 April, 1800 for 1830 (NZST, ie 0630 UTC)
Data Ethics
Emma McDonald is the Director of the Interim Centre for Data Ethics and
Innovation at Stats NZ (New Zealand Government Department of Statistics)
Emma will talk about why Stats NZ is establishing a Centre for Data
Instead, he’ll present a different
perspective on what this data structure is, and how it differs from a
list. The presentation will compare deques and lists in a visual manner,
to help us understand why we may need a deque in certain situations.
We’ll also explore some demonstration examples to
' Group meeting (suitably translated, etc)...
On 27/06/2023 05.46, small marcc via Python-list wrote:
This code creates the path to the Excel file where the data will be written. It
checks if the file already exists, and if so, reads the existing data into a
DataFrame. Otherwise, it creates a new empty DataFrame.
On 6/26/2023 1:46 PM, small marcc via Python-list wrote:
pandas.ExcelWriter
import pandas
This code creates the path to the Excel file where the data will be written. It
checks if the file already exists, and if so, reads the existing data into a
DataFrame. Otherwise, it creates a new empty DataFrame.
pandas.ExcelWriter
import pandas
This code creates the path to the Excel file where the data will be written. It
checks if the file already exists, and if so, reads the existing data into a
DataFrame. Otherwise, it creates a new empty DataFrame. Then it concatenates
the existing data with the new data.
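A minimal sketch of that flow (file and column names are illustrative;
reading and writing .xlsx also needs the third-party openpyxl engine):

    import os
    import pandas as pd

    path = "results.xlsx"
    new_data = pd.DataFrame({"sample": [1, 2], "value": [0.3, 0.7]})

    if os.path.exists(path):
        existing = pd.read_excel(path)   # reuse what's already there
    else:
        existing = pd.DataFrame()        # otherwise start empty

    combined = pd.concat([existing, new_data], ignore_index=True)
    combined.to_excel(path, index=False) # write the merged data back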
On 2/8/2023 6:39 AM, Shaozhong SHI wrote:
What is the robust way to use Python to read in an XML and turn it into
a JSON file?
JSON dictionary is actually a tree. It is much easier to manage the
tree-structured data.
XML and JSON are both for interchanging data. What are you trying to accomplish?
What is the robust way to use Python to read in an XML and turn it into a
JSON file?
JSON dictionary is actually a tree. It is much easier to manage the
tree-structured data.
Regards,
David
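One common approach, sketched with the third-party xmltodict package and
illustrative file names: parse the XML into nested dicts and lists, then
dump that structure as JSON.

    import json
    import xmltodict   # third-party: pip install xmltodict

    with open("input.xml", "rb") as f:
        doc = xmltodict.parse(f)       # element tree -> nested dict/list

    with open("output.json", "w") as f:
        json.dump(doc, f, indent=2)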
│ │ └── test-documents
│ └── validate-semantic
│ ├── 2and3
│ ├── bugs
│ └── oas3
└── standalone
└── topbar-insert
I just thought that it would be great if there was a Python utility that
visualized a similar graph for nested data structures.
Of course I am aware of indent (json.dumps()) and pprint.
you rock. Thank you, Stefan.
Dino
On 1/21/2023 2:41 PM, Stefan Ram wrote:
r...@zedat.fu-berlin.de (Stefan Ram) writes:
def display_( object, last ):
    directory = object; result = ''; count = len( directory )
    for entry in directory:
        count -= 1; name = entry; indent = ''
https://docs.python.org/3/library/pprint.html
From: Python-list on behalf of Dino
Date: Saturday, January 21, 2023 at 11:42 AM
To: python-list@python.org
Subject: tree representation of Python data
│ └── validate-semantic
│ ├── 2and3
│ ├── bugs
│ └── oas3
└── standalone
└── topbar-insert
I just thought that it would be great if there was a Python utility that
visualized a similar graph for nested data structures.
Of course I am aware of indent (json.dumps()) and pprint, and they are
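In the absence of a ready-made utility, a quick sketch of the idea: walk
nested dicts/lists and print them with the same box-drawing characters the
tree command uses.

    def render(node, prefix=""):
        items = list(node.items()) if isinstance(node, dict) else list(enumerate(node))
        for i, (key, value) in enumerate(items):
            last = i == len(items) - 1
            print(prefix + ("└── " if last else "├── ") + str(key))
            if isinstance(value, (dict, list)):
                render(value, prefix + ("    " if last else "│   "))

    render({"validate-semantic": {"2and3": {}, "bugs": {}, "oas3": {}},
            "standalone": {"topbar-insert": {}}})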
Pablo Martinez Ulloa wrote at 2022-5-18 15:08 +0100:
>I have been using your C++ Python API, in order to establish a bridge from
>C++ to Python.
Do you know `cython`?
It can help very much in the implementation of bridges between
Python and C/C++.
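If the C++ library can expose a plain C (extern "C") interface, the
standard-library ctypes route is also worth a look; a tiny sketch with
hypothetical library and function names:

    import ctypes

    lib = ctypes.CDLL("./libtactile.so")        # the compiled sensor library
    lib.read_pressure.argtypes = [ctypes.c_int]
    lib.read_pressure.restype = ctypes.c_double

    print(lib.read_pressure(0))                 # poll channel 0 in real time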
Am 18.05.22 um 16:08 schrieb Pablo Martinez Ulloa:
I have been using your C++ Python API, in order to establish a bridge from
C++ to Python. We want to do this, as we have a tactile sensor, which only
has a library developed in C++, but we want to obtain the data in real time
in Python to
On 2022-05-18 15:08, Pablo Martinez Ulloa wrote:
Hello,
I have been using your C++ Python API, in order to establish a bridge from
C++ to Python. We want to do this, as we have a tactile sensor, which only
has a library developed in C++, but we want to obtain the data in real time
in Python to
Hello,
I have been using your C++ Python API, in order to establish a bridge from
C++ to Python. We want to do this, as we have a tactile sensor, which only
has a library developed in C++, but we want to obtain the data in real time
in Python to perform tests with a robotic arm and gripper. The
On 26Mar2022 15:47, alberto wrote:
>Hi to everyone,
>I would save data from multiple files in one using pandas.
>below my script
Well, it looks like you're doing the right thing. You've got this:
results = pd.DataFrame()
for counter, current_file in enumerate(glob.
Hi to everyone,
I would save data from multiple files in one using pandas.
below my script
# Read results GCMG LAMMPS
import pandas as pd
import os
import glob
path = r"C:\Users\Documenti\Pyton\plot\results_CH4_180K\METHANE_180K_LJ_2.5-35.0_bar"
os.chdir(path)
results = pd.DataFrame()
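For reference, a minimal completion of that apparent intent (the glob
pattern, separator and output name are illustrative): read every matched
file, concatenate, save once.

    import glob
    import pandas as pd

    frames = [pd.read_csv(f, sep=r"\s+") for f in sorted(glob.glob("*.dat"))]
    results = pd.concat(frames, ignore_index=True)
    results.to_csv("combined.csv", index=False)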
Why not just have scripts that echo out the various sets of test
data you are interested in? That way, Popen would
always be your interface and you wouldn't have to
make two cases in the consumer script.
In other words, make a program that outputs test
data just like your main data source produces.
I want to save
various possible outputs of the command as text files and use those as
inputs.
What format should I use to pass data to the actual parsing function?
Is this a command you run, produces that output, and then stops (as
opposed to a long-running program that from time to time generates a
bunch of output)?
command as text files and use those as
inputs.
What format should I use to pass data to the actual parsing function?
Is this a command you run, produces that output, and then stops (as
opposed to a long-running program that from time to time generates a
bunch of output)?
Because in that case I
Loris Bennett wrote at 2022-3-11 07:40 +0100:
> ... I want to test the parsing ...
>Sorry if I was unclear but my question is:
>
>Given that the return value from Popen is a Popen object and given that
>the return value from reading a file is a single string or maybe a list
>of strings, what should
Dieter Maurer writes:
> Loris Bennett wrote at 2022-3-10 13:16 +0100:
>>I have a command which produces output like the
>>following:
>>
>> Job ID: 9431211
>> Cluster: curta
>> User/Group: build/staff
>> State: COMPLETED (exit code 0)
>> Nodes: 1
>> Cores per node: 8
>> CPU Utilized: 01:30:53
Loris Bennett wrote at 2022-3-10 13:16 +0100:
>I have a command which produces output like the
>following:
>
> Job ID: 9431211
> Cluster: curta
> User/Group: build/staff
> State: COMPLETED (exit code 0)
> Nodes: 1
> Cores per node: 8
> CPU Utilized: 01:30:53
> CPU Efficiency: 83.63% of 01:48:40 core-walltime
format should I use to pass data to the actual parsing function?
I could in both production and test convert the entire input to a string
and pass the string to the parsing method.
However, I could use something like
test_input_01 = subprocess.Popen(
    ["cat", "test_input_01"],
    stdout=subprocess.PIPE, text=True).stdout.read()
Dennis Lee Bieber writes:
> On Tue, 01 Mar 2022 08:35:05 +0100, Loris Bennett
> declaimed the following:
>
>>Thanks for the various suggestions. The data I need to store is just a
>>dict with maybe 3 or 4 keys and short string values probably of less
>>than 32 characters each per event.
Robert Latest writes:
> Loris Bennett wrote:
>> Thanks for the various suggestions. The data I need to store is just a
>> dict with maybe 3 or 4 keys and short string values probably of less
>> than 32 characters each per event. The traffic on the DB is going to be
>> very low, creating maybe a dozen events a day.
Loris Bennett wrote:
> Thanks for the various suggestions. The data I need to store is just a
> dict with maybe 3 or 4 keys and short string values probably of less
> than 32 characters each per event. The traffic on the DB is going to be
> very low, creating maybe a dozen events a day.
>> id = Column(Integer, primary_key=True)
>> date = Column('date', Date, nullable=False)
>> uid = Column('gid', String(64), ForeignKey('users.uid'), nullable=False)
>> info = ??
>>
>>The event may have arbitrary, but dict-like data associated with it,
>>which I want to add in the field 'info'. This data never needs to be
>>modified, once the event has been inserted into the DB.
> date = Column('date', Date, nullable=False)
> uid = Column('gid', String(64), ForeignKey('users.uid'), nullable=False)
> info = ??
>
>The event may have arbitrary, but dict-like data associated with it,
>which I want to add in the field 'info'. This data never needs to be
>modified, once the event has been inserted into the DB.
I'd be inclined to use JSON if the data is something that can
be easily represented that way.
--
Greg
Albert-Jan Roskam wrote:
> The event may have arbitrary, but dict-like data associated with it,
> which I want to add in the field 'info'. This data never needs to be
> modified, once the event has been inserted into the DB.
>
> What type should the info field have?
date = Column('date', Date, nullable=False)
uid = Column('gid', String(64), ForeignKey('users.uid'),
nullable=False)
info = ??
The event may have arbitrary, but dict-like data associated with it,
which I want to add in the field 'info'. This data never needs to be
modified, once the event has been inserted into the DB.
ForeignKey('users.uid'), nullable=False)
info = ??
The event may have arbitrary, but dict-like data associated with it,
which I want to add in the field 'info'. This data never needs to be
modified, once the event has been inserted into the DB.
What type should the info field have?
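One way this is often handled, sketched with SQLAlchemy's generic JSON
column type (assuming SQLAlchemy 1.4+; the surrounding model is
reconstructed from the fragments quoted above):

    from sqlalchemy import JSON, Column, Date, ForeignKey, Integer, String
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Event(Base):
        __tablename__ = "events"
        id = Column(Integer, primary_key=True)
        date = Column(Date, nullable=False)
        uid = Column(String(64), ForeignKey("users.uid"), nullable=False)
        info = Column(JSON)   # arbitrary dict-like payload, written once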
> Hi. I am learning python and I am working with some netCDF files.
> Suppose I have temperature data from 1950-2020 and I want data for
> only 1960-2015. How should I extract it? --
Alternately, use https://unidata.github.io/netcdf4-python/ or gdal.
It might also be possible to r
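A short sketch of the time-slicing with the third-party xarray package
(file and coordinate names are guesses; adjust to the actual dataset):

    import xarray as xr

    ds = xr.open_dataset("temperature.nc")
    subset = ds.sel(time=slice("1960-01-01", "2015-12-31"))  # keep 1960-2015
    subset.to_netcdf("temperature_1960_2015.nc")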
Hi. I am learning python and I am working with some netCDF files. Suppose I have
temperature data from 1950-2020 and I want data for only 1960-2015. How should
I extract it?
On 2022-02-10 17:20, BmoreIT wrote:
I did a data export from Google to export all company data - Google Data Export
It shows the root folder and to download, I run this command (it automatically
enters this command)
gsutil -m cp -r "gs://takeout-export-myUniqueID" .
But I have no idea where it would save it
I did a data export from Google to export all company data - Google Data Export
It shows the root folder and to download, I run this command (it automatically
enters this command)
gsutil -m cp -r "gs://takeout-export-myUniqueID" .
But I have no idea where it would save it being
On Thu, 3 Feb 2022 at 13:32, Avi Gross via Python-list
wrote:
>
> Jen,
>
> I would not be shocked at incompatibilities in the system described making it
> hard to exchange anything, including text, but am not clear if there is a
> limitation of four bytes in what can be shared. For me, a charact
To: python-list@python.org
Sent: Wed, Feb 2, 2022 1:27 pm
Subject: Re: Data unchanged when passing data to Python in multiprocessing
shared memory
An ASCII string will not work. If you convert 32894 to an ascii string you
will have five bytes, but you need four. In my original post I showed the C
program I
>>>
>> Look at the struct module. I'm pretty certain it has flags for big or
>> little end, or system native (that, or run your integers through the
>> various "network byte order" functions that I think C and Python both
>> support.
>>
>>
The first character of the format string can be used to
indicate the byte order, size and alignment of the packed data, according
to the following table:
Character   Byte order              Size       Alignment
@           native                  native     native
=           native                  standard   none
<           little-endian           standard   none
>           big-endian              standard   none
!           network (= big-endian)  standard   none
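Applied to the integer from this thread, a small sketch: pack it as exactly
four bytes with an explicit byte order, then read it back.

    import struct

    n = 32894                         # 0x807E
    little = struct.pack("<I", n)     # b'\x7e\x80\x00\x00'
    big = struct.pack(">I", n)        # b'\x00\x00\x80\x7e'
    assert struct.unpack("<I", little)[0] == n
    assert struct.unpack(">I", big)[0] == n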
> I applaud trying to find the right solution but wonder if a more trivial
> solution is even being considered. It ignores big and little endians and just
> converts your data into another form and back.
>
> If all you want to do is send an integer that fits in 32 bits or 64 bits, why
> not convert it to a character string and back?
> Look at the struct module. I'm pretty certain it has flags for big or
> little end, or system native (that, or run your integers through the
> various "network byte order" functions that I think C and Python both
> support.
>
> https://www.gta.ufrj.br/ensino/eel878/sockets/htonsman.html
>
>
> >However, if anyone on this list knows how to pass data from a non-Python
> >language to Python in multiprocessing.shared_memory please let me (and the
> >list) know.
I applaud trying to find the right solution but wonder if a more trivial
solution is even being considered. It ignores big and little endians and just
converts your data into another form and back.
If all you want to do is send an integer that fits in 32 bits or 64 bits, why
not convert it to a character string and back?
various "network byte order" functions that I think C and Python both
support.
https://www.gta.ufrj.br/ensino/eel878/sockets/htonsman.html
>However, if anyone on this list knows how to pass data from a non-Python
>language to Python in multiprocessing.shared_memory please let me (and the
>list) know.
MMU cach
> On 1 Feb 2022, at 23:40, Jen Kris wrote:
>
> Barry, thanks for your reply.
>
> On the theory that it is not yet possible to pass data from a non-Python
> language to Python with multiprocessing.shared_memory, I bypassed the problem
> by attaching 4 bytes to my FIFO pipe message from NASM to Python:
Barry, thanks for your reply.
On the theory that it is not yet possible to pass data from a non-Python
language to Python with multiprocessing.shared_memory, I bypassed the problem
by attaching 4 bytes to my FIFO pipe message from NASM to Python:
byte_val = v[10:14]
where v is the message
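On the Python side those four bytes convert straight back with
int.from_bytes; a tiny sketch (the message contents are illustrative, and
the byte order must match what the NASM code wrote):

    v = b"0123456789\x7e\x80\x00\x00"           # stand-in for the FIFO message
    byte_val = v[10:14]
    print(int.from_bytes(byte_val, "little"))   # 32894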
> On 1 Feb 2022, at 20:26, Jen Kris via Python-list
> wrote:
>
> I am using multiprocesssing.shared_memory to pass data between NASM and
> Python. The shared memory is created in NASM before Python is called.
> Python connects to the shm: shm_00 =
> shared_memory.SharedMemory(name='shm_object_00', create=False).
I am using multiprocesssing.shared_memory to pass data between NASM and Python.
The shared memory is created in NASM before Python is called. Python connects
to the shm: shm_00 =
shared_memory.SharedMemory(name='shm_object_00',create=False).
I have used shared memory at other
Hello,
I understand that you want to share data across examples (docstrings)
because you are running doctest to validate them (and test).
The doctest implementation evaluates each docstring separately without
sharing the context so the short answer is "no".
This is a limitation
On Wed, 12 Jan 2022 09:28:16 +1100,
Cameron Simpson wrote:
[...]
> Personally I'd be inclined to put long identical examples in the class
> docstring instead of the method, but that may not be appropriate.
Good point, and perhaps it's best to put a comprehensive example in the
class docstring,
On 11Jan2022 16:09, Sebastian Luque wrote:
>I am searching for a mechanism for sharing data across Examples
>sections
>in docstrings within a class. For instance:
>
>    class Foo:
>
>        def foo(self):
>            """Method foo title
>
>            The example generating data below may be much more laborious.
On Wed, Jan 12, 2022 at 9:11 AM Sebastian Luque wrote:
>
> Hello,
>
> I am searching for a mechanism for sharing data across Examples sections
> in docstrings within a class. For instance:
This seems like trying to cram too much information into the
docstring, but oh well... d
Hello,
I am searching for a mechanism for sharing data across Examples sections
in docstrings within a class. For instance:
class Foo:

    def foo(self):
        """Method foo title

        The example generating data below may be much more laborious.