New idea about JPEG encode and decode

To reduce CPU load, can I use the CPU and GPU together to perform JPEG encoding and decoding? Is this idea feasible? Can the time spent on JPEG encoding and decoding be reduced?

vladimir-dudnik (Intel)
Best Reply

When you consider a heterogeneous model (where part of your algorithm runs on the CPU and the other part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. In the GPU case you have to send data over the PCIe bus, which may cost too much in overall application performance.
For an algorithm as simple as JPEG this may give no benefit, while for other, more computationally intensive applications it may pay off.

Regards,
Vladimir
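To make the transfer overhead concrete, here is a rough back-of-envelope estimate for the decode split discussed in this thread. The bandwidth, image size, and coefficient width below are assumptions for illustration, not measurements:

```python
# Rough estimate of the PCIe transfer cost in a hybrid CPU/GPU JPEG
# decode: the CPU does Huffman decoding, then ships the DCT
# coefficients to the GPU. All numbers are assumptions.

PCIE_BANDWIDTH = 4e9          # assumed effective PCIe bandwidth, bytes/s
WIDTH, HEIGHT = 1920, 1080    # assumed image size
BYTES_PER_COEFF = 2           # coefficients stored as 16-bit integers

# With 4:2:0 subsampling, luma is full size and each chroma plane is a
# quarter size, so total samples are 1.5x the pixel count.
coeff_bytes = WIDTH * HEIGHT * 1.5 * BYTES_PER_COEFF

transfer_s = coeff_bytes / PCIE_BANDWIDTH
print(f"coefficients: {coeff_bytes / 1e6:.1f} MB, "
      f"one-way transfer: {transfer_s * 1e3:.2f} ms")
```

Under these assumptions a single 1080p frame's coefficients are roughly 6 MB and take on the order of a millisecond to move one way, so the transfer can easily rival the GPU compute it is meant to accelerate.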

Quoting - Vladimir Dudnik (Intel)
When you consider a heterogeneous model (where part of your algorithm runs on the CPU and the other part runs on the GPU), you need to take into account the overhead of exchanging data between those two parts. In the GPU case you have to send data over the PCIe bus, which may cost too much in overall application performance.
For an algorithm as simple as JPEG this may give no benefit, while for other, more computationally intensive applications it may pay off.

Regards,
Vladimir

Do you mean this method can't save time for JPEG encode or decode? I made a sample to do this, but it costs too much time to send data between the GPU and the CPU. I don't know whether to continue this research or to give up on the idea.

vladimir-dudnik (Intel)

Well, it is up to you to decide whether to continue with this or not. My guess is that the only approach that might make sense with an existing GPU (which can't implement the whole JPEG algorithm) is something like doing Huffman decoding on the CPU and sending the decoded coefficients to the GPU, where you can try to do the IDCT, color conversion, and display.

Vladimir
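The GPU-side decode stages mentioned here (IDCT plus color conversion) can be sketched as plain scalar reference code; on a real GPU each 8x8 block, or each pixel, would be processed by its own thread. This is a naive textbook version for illustration, not tuned code:

```python
import math

def idct_8x8(coeffs):
    """Naive 2-D inverse DCT of one 8x8 block of dequantized
    coefficients. On a GPU, one thread group would handle one block."""
    out = [[0.0] * 8 for _ in range(8)]
    for x in range(8):
        for y in range(8):
            s = 0.0
            for u in range(8):
                for v in range(8):
                    cu = math.sqrt(0.5) if u == 0 else 1.0
                    cv = math.sqrt(0.5) if v == 0 else 1.0
                    s += (cu * cv * coeffs[u][v]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[x][y] = s / 4.0
    return out

def ycbcr_to_rgb(y, cb, cr):
    """JPEG (JFIF) YCbCr -> RGB conversion for one pixel."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)
```

Both functions are embarrassingly parallel across blocks and pixels, which is exactly why this half of the decoder is the natural candidate for the GPU while the inherently serial Huffman decode stays on the CPU.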

Quoting - Vladimir Dudnik (Intel)
Well, it is up to you to decide whether to continue with this or not. My guess is that the only approach that might make sense with an existing GPU (which can't implement the whole JPEG algorithm) is something like doing Huffman decoding on the CPU and sending the decoded coefficients to the GPU, where you can try to do the IDCT, color conversion, and display.

Vladimir

Yes, you are right, but that is only part of my work. I also need to encode JPEG, which means doing color conversion, DCT, and quantization on the GPU and then sending the data back to the CPU for Huffman encoding. That requires exchanging data twice, which costs too much time.
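The encode-side GPU stages described above can be sketched the same way as scalar reference code; the single quantization step size is an assumption for illustration (a real encoder uses a full 8x8 quantization table), and on a GPU each block would run in parallel before the quantized coefficients are copied back for Huffman coding on the CPU:

```python
import math

def rgb_to_ycbcr(r, g, b):
    """JPEG (JFIF) RGB -> YCbCr conversion for one pixel."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def fdct_8x8(block):
    """Naive 2-D forward DCT of one 8x8 block of level-shifted samples."""
    out = [[0.0] * 8 for _ in range(8)]
    for u in range(8):
        for v in range(8):
            cu = math.sqrt(0.5) if u == 0 else 1.0
            cv = math.sqrt(0.5) if v == 0 else 1.0
            s = 0.0
            for x in range(8):
                for y in range(8):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[u][v] = cu * cv * s / 4.0
    return out

def quantize(coeffs, step=16):
    """Uniform quantization; a single step size is an assumption here,
    standing in for the per-frequency 8x8 table a real encoder uses."""
    return [[round(c / step) for c in row] for row in coeffs]
```

Everything above parallelizes per block, so it suits the GPU; only the final entropy coding is serial, which is what forces the round trip back to the CPU and the second transfer the post complains about.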

Also, I can't make the GPU and CPU work concurrently, so... it can't save time.
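One common way to attack exactly this problem is pipelining: while the GPU processes block N, the CPU entropy-codes block N-1, so the transfer and the two compute stages overlap instead of running back to back. Here is a minimal sketch of that structure using Python threads, where the two stage callables are placeholders standing in for real GPU work and real Huffman coding, not an actual GPU API (real device transfers run outside the interpreter, which is what makes this overlap pay off in practice):

```python
import queue
import threading

def pipeline(blocks, gpu_stage, cpu_stage):
    """Run gpu_stage(block) and cpu_stage(result) concurrently:
    while the 'GPU' thread works on block N, the 'CPU' consumer
    processes block N-1. Illustrates only the overlap structure."""
    q = queue.Queue(maxsize=2)   # small queue = bounded double-buffering
    results = []

    def producer():
        for b in blocks:
            q.put(gpu_stage(b))  # simulated GPU work + transfer back
        q.put(None)              # end-of-stream marker

    t = threading.Thread(target=producer)
    t.start()
    while (item := q.get()) is not None:
        results.append(cpu_stage(item))  # simulated CPU Huffman coding
    t.join()
    return results

# Example: the 'GPU' doubles each value, the 'CPU' adds one.
print(pipeline([1, 2, 3], lambda b: b * 2, lambda r: r + 1))  # -> [3, 5, 7]
```

With this shape, each transfer is hidden behind the compute of the neighboring stage once the pipeline fills, rather than being added on top of it.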

vladimir-dudnik (Intel)

I think that with the future Intel Larrabee architecture you will be able to implement Huffman encoding natively on the GPU. In that case you can put the whole JPEG algorithm on the GPU. Stay tuned.

Regards,
Vladimir

Quoting - Vladimir Dudnik (Intel)
I think that with the future Intel Larrabee architecture you will be able to implement Huffman encoding natively on the GPU. In that case you can put the whole JPEG algorithm on the GPU. Stay tuned.

Regards,
Vladimir

Thanks a lot. If Huffman encoding can be implemented on the GPU, the time spent on JPEG encode and decode can be reduced.
