I am using the Intel IPP H.264 encoder in our application. We use it at various resolutions and bitrates.
For 320x240 at 100 kbps and 30 fps, CPU usage is around 20 to 35%,
whereas for 640x480 at 100 kbps and 30 fps, CPU usage is around 80 to 100%.
Since 320x240 takes 20 to 35% CPU, I assumed that for double the resolution at the same fps and bitrate the CPU usage would also double. But instead it goes well beyond double, closer to triple.
Is the assumption "resolution doubled, so CPU usage doubles" valid? Is this behavior expected?
Please provide a proper explanation.
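One detail worth checking in the arithmetic: going from 320x240 to 640x480 doubles both width and height, so the per-frame pixel count the encoder must process is four times larger, not two. A quick sketch of the numbers being compared:

```python
# Per-frame pixel counts for the two resolutions in question.
small = 320 * 240   # 76,800 pixels per frame
large = 640 * 480   # 307,200 pixels per frame

# The encoder's per-frame workload scales with pixel count,
# so the ratio is 4x, not 2x.
print(small, large, large // small)  # 76800 307200 4
```

Under that measure, 3x to 4x CPU usage for the larger resolution is closer to the expected scaling than 2x would be.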