Performance when creating an image pixel by pixel

PerlMagick is an object-oriented Perl interface to ImageMagick. Use this forum to discuss, make suggestions about, or report bugs concerning PerlMagick.
mad_ady
Posts: 2
Joined: 2012-03-20T23:37:39-07:00

Performance when creating an image pixel by pixel

Post by mad_ady » 2012-03-20T23:53:05-07:00

Hello. I am a new PerlMagick user and this is what I would like to accomplish:
* create a PNG file of known width and height (and color depth), receiving all the pixel data serially as RGBA integers.

My plan is to add the pixels one by one until the image is ready, and then write the image to disk.

I've looked through the documentation and I see that I could do this by using something like:

Code:

$image->Draw(fill=>$pixelColor, primitive=>'point', points=>"$x,$y");
... and I would iterate through each line and row doing this.

I have the following concerns/questions:
1. The image I'm drawing will be 1280x720x32bpp. This will take up roughly 3.6MB in RAM. I am working on an embedded system, so I have some memory constraints.
I'd like to know whether PerlMagick compresses progressively behind the scenes as I add pixels (e.g. grouping similar pixels before I finish adding them all), so that it wouldn't use the full 3.6MB of RAM at runtime. Or does it store the image internally as raw RGBA and only compress it when I write to disk?
2. How costly is it (in terms of CPU cycles) to add each pixel to the image? Would it be more efficient if I added a full row at a time? Can I add multiple pixels at once with one function call?
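For reference, the 3.6MB figure is just width × height × 4 bytes (8-bit RGBA); a quick sanity check of the arithmetic (sketched in Python purely for illustration):

```python
# Back-of-the-envelope memory estimate for a 1280x720 RGBA image.
width, height, bytes_per_pixel = 1280, 720, 4  # 8 bits per channel, RGBA

raw_bytes = width * height * bytes_per_pixel
print(raw_bytes)                      # 3686400 bytes
print(round(raw_bytes / 1024**2, 2))  # 3.52 MiB
```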

Thanks for the help. If you have other suggestions on how I could do this more efficiently, please let me know.

anthony
Posts: 8883
Joined: 2004-05-31T19:27:03-07:00
Location: Brisbane, Australia

Re: Performance when creating an image pixel by pixel

Post by anthony » 2012-03-21T00:45:19-07:00

How do you know when an image is 'ready'?


You may also like to check out the Pixel Enumeration File Format.
This format can be read by ImageMagick, and when it is read the pixels may be listed in any order.
Any pixel not specified is set to the background color.

http://www.imagemagick.org/Usage/files/#txt
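To make that concrete, here is a sketch of writing such a pixel-enumeration file as RGBA values arrive serially (Python is used here only for illustration; the 2x2 size, pixel values, and file name are made up):

```python
# Write a tiny ImageMagick "pixel enumeration" (TXT:) file serially,
# one pixel per line, in whatever order the data happens to arrive.
width, height = 2, 2
pixels = [  # (x, y, r, g, b, a) -- arriving serially, any order
    (0, 0, 255, 0, 0, 255),
    (1, 0, 0, 255, 0, 255),
    (0, 1, 0, 0, 255, 255),
    (1, 1, 255, 255, 255, 128),
]

with open("pixels.txt", "w") as f:
    # Header line: width, height, maximum channel value, colorspace
    f.write(f"# ImageMagick pixel enumeration: {width},{height},255,rgba\n")
    for x, y, r, g, b, a in pixels:
        f.write(f"{x},{y}: ({r},{g},{b},{a})\n")

# Then, e.g.:  convert pixels.txt out.png
```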


One example appears in a demonstration of a 'primitive' image distortion technique (using awk to do the rotation mapping of individual pixels):
http://www.imagemagick.org/Usage/distor ... rd_mapping
Anthony Thyssen -- Webmaster for ImageMagick Example Pages
https://imagemagick.org/Usage/

mad_ady
Posts: 2
Joined: 2012-03-20T23:37:39-07:00

Re: Performance when creating an image pixel by pixel

Post by mad_ady » 2012-03-21T01:47:25-07:00

Thanks for the suggestion. I didn't know I could load data from a text file.
I suppose your recommendation is to call the convert binary directly, instead of relying on the Perl API, right? That way I could offload the file contents onto a filesystem where I have some free space, and call convert to process the data once it has been written to file.

Oh, by ready, I meant knowing the color values for all the pixels. Since I will be getting the data serially, it will take a while until I have all of it. I wondered if the Perl API would start compressing the data before I finished collecting it all (e.g. start compressing the first lines of the file while I am still adding lines near the end).

Anyway, writing to file and calling convert is a viable solution, because it lifts some of the memory constraints (I suppose convert will use more than 3.6MB during runtime, but I was afraid my process + imagemagick would keep the data twice in their buffers, which would lead to ~7MB of used memory).

anthony
Posts: 8883
Joined: 2004-05-31T19:27:03-07:00
Location: Brisbane, Australia

Re: Performance when creating an image pixel by pixel

Post by anthony » 2012-03-21T16:43:16-07:00

You can use the API! The TXT format was just a suggestion as an alternative; sometimes it is easier.
At the very least it should give a baseline (from the underlying C code) for how fast single-pixel (random) updates are.

PerlMagick can also read TXT: images (using the normal read-image methods). It can even read an image from a previously opened file descriptor using TXT:fd:N, where 'N' is the descriptor number.

Feeding the ImageMagick API while running it at the same time, however, could prove tricky, and would probably require a separate process (or pipeline).
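As a rough illustration of that pipeline idea: a producer writes pixel-enumeration lines into one end of a pipe while a consumer drains the other end; in real use the consumer would be ImageMagick reading TXT:fd:N. A minimal sketch (in Python, with the consumer stubbed out as a plain read):

```python
import os
import threading

# One pipe: the producer owns the write end, the consumer the read end.
read_fd, write_fd = os.pipe()

def produce(fd):
    # Producer: writes pixel-enumeration lines into the pipe, then
    # closes it so the consumer sees end-of-file.
    with os.fdopen(fd, "w") as w:
        w.write("# ImageMagick pixel enumeration: 1,1,255,rgba\n")
        w.write("0,0: (255,0,0,255)\n")

t = threading.Thread(target=produce, args=(write_fd,))
t.start()

# Consumer: in real use this end would be ImageMagick reading TXT:fd:N;
# here we just read the text back to show the plumbing.
with os.fdopen(read_fd) as r:
    data = r.read()
t.join()
print(data, end="")
```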

The API is probably better.


There are demonstrations of setting single pixels in the PerlMagick/demo sub-directory of the sources ("single-pixels.pl"),
while the demo script "pixel-fx.pl" shows how you can iterate through all the pixels in sequence.

Unfortunately I have not yet attempted to figure out the PerlMagick equivalent of the 'row by row pixel cache/sync' that is used in MagickCore, though it should be possible. When I do learn how to do that I will update the "pixel-fx.pl" example to use that method rather than pixel-by-pixel.

Perhaps you know? Or someone else? While I am a Perl programmer (I have even published some CPAN modules), I don't know the ins and outs of PerlMagick.

anthony
Posts: 8883
Joined: 2004-05-31T19:27:03-07:00
Location: Brisbane, Australia

Re: Performance when creating an image pixel by pixel

Post by anthony » 2012-03-21T16:53:41-07:00

mad_ady wrote:Oh, by ready, I meant knowing all the color values for all the pixels. Since I will be getting the data serially, it will take a while until I have all the data. I wondered if the perl API would start compressing the data before I finished collecting all the data (e.g. start compressing the first lines of the file while I add more lines near the end).
The TXT: method may be useful for reading all the pixels.

Images in memory are stored uncompressed (at a compile-time quantum depth, with a minimum of three channels in IMv6; IMv7 allows single-channel grey images, or up to 32-channel multi-spectral images). But for really large images where you do not want the whole image in memory, you may like to use "stream" for image reading. It reads an image from disk line by line (as the image format allows) and outputs the pixels in a raw image format that Perl can read. That is, pipe-open the "stream" command (after getting the image size) and read the integer values one group at a time.
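The read-one-group-at-a-time loop might look like this sketch (Python here just to show the shape of the loop; the "stream" options in the comment are an assumption to check against the manual, and a stand-in byte buffer replaces the pipe):

```python
import io

def iter_pixels(fh, bytes_per_pixel=4):
    # Yield one pixel at a time from a raw byte stream, never holding
    # more than one pixel's worth of data in memory.
    while chunk := fh.read(bytes_per_pixel):
        if len(chunk) < bytes_per_pixel:
            break  # ignore a truncated trailing group
        yield tuple(chunk)  # e.g. (r, g, b, a) as 0-255 integers

# Stand-in for a pipe such as:  stream -map rgba -storage-type char in.png -
fake_stream = io.BytesIO(bytes([255, 0, 0, 255, 0, 255, 0, 255]))
pixels = list(iter_pixels(fake_stream))
print(pixels)  # [(255, 0, 0, 255), (0, 255, 0, 255)]
```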

The only trouble is that I have not seen much about "streaming" writes to image file formats other than the raw-style ones (raw, PbmPlus, MIFF, etc).
mad_ady wrote:Anyway, writing to file and calling convert is a viable solution, because it lifts some of the memory constraints (I suppose convert will use more than 3.6MB during runtime, but I was afraid my process + imagemagick would keep the data twice in their buffers, which would lead to ~7MB of used memory).
Possibly, but some streaming methods should help in this matter. I just don't know a lot about them as yet.

I want to get 'per-image' streaming added to IMv7: that is, read one whole image at a time from a stream, for processing very long video sequences.
