In this paper, an optimization-based modeling and solution framework is described for inferring gene regulatory networks while accounting for time delay. The proposed framework uses the basic linear model of gene regulation, with Boolean variables capturing the existence of discrete time delays between the various regulatory relationships. The time delays that best fit the expression profiles are then inferred by minimizing the error between predicted and experimental expression values. Computational experiments are conducted on both in numero and real expression data sets. The former reveal that if time delay is neglected in a system known a priori to be characterized by time delay, a significantly larger number of parameters is needed to describe the system dynamics. The real microarray data example reveals a considerable number of time-delayed interactions, suggesting that time delay is ubiquitous in gene regulation. Incorporating time delay also leads to sparser inferred networks. Analysis of the variance in the data explained by the model, and comparison with randomized data, reveals that accounting for time delay explains more variance in real than in randomized data.
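
As a minimal sketch of the delay-selection idea (not the paper's actual optimization formulation), the Python code below fits the basic linear model x_i(t+1) = sum_j w_ij x_j(t - d_j) for one target gene and chooses, per regulator, the discrete delay minimizing squared prediction error. Here the Boolean delay variables are resolved by brute-force enumeration rather than by a mathematical programming solver; the function name, max_delay parameter, and data layout are illustrative assumptions.

import itertools
import numpy as np

def fit_delayed_linear_model(X, target, regulators, max_delay=3):
    """X: (genes x timepoints) expression matrix.
    Returns (best_delays, best_weights, best_sse) for one target gene,
    enumerating all discrete-delay assignments (a brute-force stand-in
    for the Boolean delay variables of the optimization model)."""
    T = X.shape[1]
    best = (None, None, np.inf)
    for delays in itertools.product(range(max_delay + 1), repeat=len(regulators)):
        d_max = max(delays)
        # Lagged design matrix: column j is regulator j shifted by delays[j].
        A = np.column_stack([X[r, d_max - d : T - 1 - d]
                             for r, d in zip(regulators, delays)])
        y = X[target, d_max + 1 : T]
        # Least-squares weights for this delay assignment.
        w, *_ = np.linalg.lstsq(A, y, rcond=None)
        sse = np.sum((A @ w - y) ** 2)
        if sse < best[2]:
            best = (delays, w, sse)
    return best

# Toy usage: 4 genes observed over 20 time points.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 20))
delays, weights, sse = fit_delayed_linear_model(X, target=0, regulators=[1, 2, 3])

Enumeration is exponential in the number of regulators; the paper's framework instead encodes the delay choices as Boolean variables inside a single error-minimization problem, which scales to realistic network sizes.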