How to approach this question


Consider 4 processes sharing the CPU in round-robin fashion. If the context-switching time is 1 second, what is the maximum time quantum ‘q’ such that the number of context switches is reduced, while each process is still guaranteed a turn at the CPU every 10 seconds?


Each process runs for one time quantum q. Suppose there are n processes: p1, p2, p3, …, pn.
Then p1’s turn comes again only after the remaining processes p2 to pn have each used their quantum, i.e., after at most (n-1)q time.
So each process in round robin gets its next turn after (n-1)q time if we ignore overheads. With a context-switch overhead of s per switch, one full cycle contains n context switches, so the gap becomes ns + (n-1)q.
So we need ns + (n-1)q <= t, where t is the guaranteed response interval.
Here n = 4, s = 1 and t = 10, so 4(1) + 3q <= 10, which gives q <= 2. Hence the maximum time quantum is q = 2 seconds.
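The inequality ns + (n-1)q <= t can be rearranged to q <= (t - ns)/(n-1), which is easy to sanity-check in code. A minimal sketch (the function name max_quantum is my own, not from the question):

```python
def max_quantum(n, s, t):
    """Largest time quantum q satisfying n*s + (n-1)*q <= t,
    for n processes in round robin with context-switch time s
    and guaranteed response interval t."""
    return (t - n * s) / (n - 1)

# Values from the question: n=4 processes, s=1 second switch time,
# each process must get a turn every t=10 seconds.
print(max_quantum(n=4, s=1, t=10))  # → 2.0
```

Plugging in the question's numbers reproduces the answer q = 2 seconds.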