In grid computing, agent-based load balancing is one of the most important problems. In this paper, we present a macroscopic model that describes the dynamics of agent-based load balancing with time delays. We focus on the number and size of the teams in which tasks queue. The time gap during which a single agent searches for a suitable node and transfers a task to it is incorporated into the balancing process as a delay. The model consists of functional differential equations. Numerical simulations show that the variables in the model (the number and size of teams, etc.) remain nonnegative, in agreement with their physical meaning. They also show that, although the dynamics oscillate for a period, the behavior converges to a steady state, in agreement with recent experiments on Anthill. An interesting phenomenon emerges: the larger the delay, the longer the oscillation period, and the slower the convergence of load balancing.
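The delay-induced oscillation and slowed convergence can be illustrated with a toy linear delay model, dq/dt = -k q(t - τ), where q(t) is the imbalance of a queue and τ is the search-and-transfer delay. This is only an illustrative sketch under simplified assumptions, not the paper's actual system of functional differential equations (which tracks team counts and sizes); the function and parameter names below are hypothetical.

```python
def simulate_delayed_balance(k=1.0, tau=1.0, q0=1.0, dt=0.001, t_end=30.0):
    """Euler integration of the toy delay equation dq/dt = -k * q(t - tau),
    modeling relaxation of a load imbalance q toward the balanced state q = 0.
    Uses a constant history q(t) = q0 for t <= 0."""
    n_delay = int(round(tau / dt))          # delay expressed in time steps
    history = [q0] * (n_delay + 1)          # stored trajectory incl. history
    for _ in range(int(t_end / dt)):
        q_delayed = history[-1 - n_delay]   # q(t - tau)
        history.append(history[-1] - dt * k * q_delayed)
    return history[n_delay:]                # trajectory for t >= 0

# Small delay (k*tau < 1/e): monotone, fast decay.
fast = simulate_delayed_balance(tau=0.2)
# Larger delay (1/e < k*tau < pi/2): oscillatory, slower decay.
slow = simulate_delayed_balance(tau=1.4)
```

For k·τ below 1/e the trajectory decays monotonically; between 1/e and π/2 it oscillates around the steady state while still converging, and the residual at a fixed time grows with τ, which mirrors the qualitative finding that larger delays lengthen the oscillation period and slow down balancing.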