# eml.sync

## `replica_mean`

During training, all data (including weights, gradients, and metrics) is visible only to the current replica. For example, if you calculate the batch loss after every iteration and print it to stdout, you will only see the batch loss of a single replica.

Allreduce allows replicas to communicate. You can calculate the mean across replicas using `eml.sync.replica_mean`. Note that this function only accepts a NumPy array or a NumPy scalar.

#### Calculate the average loss across replicas
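A minimal sketch of averaging an epoch's loss across replicas. The loss values are illustrative, and the `ImportError` fallback (which treats a single process as one replica, so the cross-replica mean is the value itself) is an assumption added so the sketch runs without the library installed:

```python
import numpy as np

try:
    import eml.sync
    replica_mean = eml.sync.replica_mean
except ImportError:
    # Single-process fallback: with one replica, the mean across
    # replicas is just the local value.
    replica_mean = lambda x: x

# Hypothetical per-iteration batch losses recorded by this replica
local_losses = np.array([0.91, 0.84, 0.77])

# replica_mean only accepts a NumPy array or NumPy scalar,
# so reduce to a NumPy scalar before the allreduce.
local_mean_loss = np.float64(local_losses.mean())

# Mean of the per-replica averages across all replicas
global_mean_loss = replica_mean(local_mean_loss)
print(global_mean_loss)
```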

#### Calculate the accuracy on your test set
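A sketch of combining per-replica test accuracy, assuming each replica evaluates its own shard of the test set. The predictions, labels, and single-process fallback are illustrative; note the caveat in the comments about unequal shard sizes:

```python
import numpy as np

try:
    import eml.sync
    replica_mean = eml.sync.replica_mean
except ImportError:
    replica_mean = lambda x: x  # single-replica fallback

# Hypothetical predictions and labels for this replica's test shard
preds  = np.array([1, 0, 1, 1, 0, 1, 0, 0])
labels = np.array([1, 0, 0, 1, 0, 1, 1, 0])

# Local accuracy on this shard, as a NumPy scalar
local_acc = np.float64((preds == labels).mean())

# Averaging shard accuracies is exact only when every replica holds
# the same number of test examples; otherwise sum correct/total
# counts with replica_sum instead.
test_acc = replica_mean(local_acc)
print(test_acc)
```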

## `replica_sum`

As with `replica_mean`, per-replica data is visible only to the current replica, and allreduce lets replicas communicate. You can calculate the sum across replicas using `eml.sync.replica_sum`. Note that this function only accepts a NumPy array or a NumPy scalar.
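A sketch of using a cross-replica sum to compute an exact global test accuracy from per-shard counts. The counts and the single-process fallback are illustrative assumptions:

```python
import numpy as np

try:
    import eml.sync
    replica_sum = eml.sync.replica_sum
except ImportError:
    # Single-process fallback: the sum across one replica is the value.
    replica_sum = lambda x: x

# Hypothetical correct-prediction and example counts for this
# replica's test shard, as NumPy scalars
local_correct = np.int64(6)
local_total = np.int64(8)

# Summing raw counts (rather than averaging accuracies) stays exact
# even when replicas hold shards of different sizes.
global_correct = replica_sum(local_correct)
global_total = replica_sum(local_total)
print(global_correct / global_total)
```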