32) Memory Allocation Failure
Realistically, what are the chances of a new (or new[]) operation failing in a present-day operating environment? There are at least three possible scenarios to consider: workstation, multi-user, and embedded.
Workstation: On a workstation it is very unlikely for a new operation to fail. Workstations are expected to handle many tasks at once (hence higher memory requirements), and to ensure smooth operation they are typically provisioned with ample main memory.
Multi-user: In multi-user systems, where several users share one machine in parallel over a network, more main memory is installed to handle the concurrent load, so the likelihood of an allocation failing depends mainly on how many users are active. Note also that today's systems provide a large virtual address space through paging, so memory allocation is rarely a problem: pages can be swapped to disk to keep programs running correctly.
Embedded: Embedded systems have very little main memory, so it is quite possible for a heap/dynamic memory allocation (using new) to fail. Another reason less memory is available is that the addressable memory is usually smaller on embedded systems: they prefer memory-mapped I/O (the data bus is faster, and sometimes there is no separate I/O bus), so part of an already small address space is reserved for devices rather than RAM.
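Since an allocation failure is a realistic possibility at the embedded end, it helps to know how a failed new actually reports the problem. Below is a minimal C++ sketch of the two standard detection styles; the oversized request is only an illustrative value chosen to force a failure on most machines.

#include <iostream>
#include <new>      // std::bad_alloc, std::nothrow
#include <cstddef>

int main() {
    // Deliberately huge request (illustrative only) to provoke a failure.
    const std::size_t huge = static_cast<std::size_t>(-1) / 2;

    // 1) Default new: reports failure by throwing std::bad_alloc.
    try {
        char* p = new char[huge];
        delete[] p;                       // only reached if the allocation succeeded
    } catch (const std::bad_alloc& e) {
        std::cerr << "new threw: " << e.what() << '\n';
    }

    // 2) Nothrow new: reports failure by returning a null pointer,
    //    often preferred where exceptions are disabled (common in embedded builds).
    char* q = new (std::nothrow) char[huge];
    if (q == nullptr) {
        std::cerr << "new(std::nothrow) returned nullptr\n";
    } else {
        delete[] q;
    }
    return 0;
}

On a workstation or multi-user system this program will typically print both failure messages only because the request is absurdly large; for reasonably sized requests, new almost never fails there, whereas embedded code should be written to expect and handle either outcome.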