Large data sets have become a common resource in many scientific areas, and numerous applications now handle information on the terabyte or even petabyte scale. Many scientists access vast amounts of data, distributed geographically around the world, through advanced computing systems. This volume of data and computation has created new problems in data access, processing, and distribution, and building a data management infrastructure that copes with large data volumes, dispersed geographical locations, and complex computations has become very difficult. The Data Grid is a remedy to these problems. In this paper, a new method of dynamic optimal data replication in the Data Grid is introduced; it reduces total job execution time and increases access locality by detecting and exploiting the factors that influence data replication. The proposed method consists of two main phases. The first phase is the file request and replication phase: we evaluate three factors that influence data replication and decide whether the requested file should be replicated locally or accessed remotely. In the second phase, the replacement phase, the proposed method checks whether the destination has enough free space to store the requested file; if not, it selects the replica with the lowest value for deletion, based on three replica factors, in order to increase system performance. Simulation results show that the proposed method outperforms the other replication methods implemented in the OptorSim simulator.
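The two-phase flow described above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the factor names, thresholds, and the replica value function are placeholder assumptions, since the abstract does not specify the three factors used in each phase.

```python
# Hypothetical sketch of a two-phase replication scheme: phase 1 decides
# whether to replicate or read remotely; phase 2 evicts the lowest-value
# replica(s) when the destination lacks space. All factors are illustrative.

def should_replicate(access_count, file_size_mb, bandwidth_mbps,
                     access_threshold=5, size_threshold_mb=1000,
                     bandwidth_threshold_mbps=100.0):
    """Phase 1: decide between local replication and remote access,
    using three assumed factors (popularity, file size, link bandwidth)."""
    if access_count < access_threshold:
        return False  # rarely requested: remote access is cheaper
    if file_size_mb > size_threshold_mb and bandwidth_mbps > bandwidth_threshold_mbps:
        return False  # very large file on a fast link: read remotely
    return True       # popular file worth storing locally

def replica_value(replica):
    """Illustrative 'value' of a stored replica (higher = more worth keeping),
    combining three assumed factors: popularity, age, and size."""
    return (replica["access_count"] / (1 + replica["age"])) / replica["size_mb"]

def make_room(storage, free_space_mb, needed_mb):
    """Phase 2: evict lowest-value replicas until the new file fits."""
    evicted = []
    while free_space_mb < needed_mb and storage:
        victim = min(storage, key=replica_value)
        storage.remove(victim)
        free_space_mb += victim["size_mb"]
        evicted.append(victim)
    return evicted, free_space_mb
```

In practice, phase 1 would run when a job requests a file, and `make_room` would run only when `should_replicate` returns `True` and the local storage element is full.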