Just measure the voltage, not the amperage. You are only supposed to measure amperage (current) when you have a load attached. Attaching the multimeter directly across the transformer to measure current is the wrong way to do it and will give you useless results. You put the ammeter in series between the transformer and the load. The load will draw only the current it needs to run; it will not draw the 4 A or 3.2 A you measured. The current ratings on a transformer are only there to say "make sure the load you attach doesn't draw more current than this."
Ideally the center tap should sit at exactly half the voltage of the full winding. In the real world, the voltage from one end to the center tap will be slightly different from the voltage from the other end to the center tap, but not by enough to make any real difference.
If the transformer has any markings or numbers on it, you can search Yahoo or Google for those to find its specifications on a data sheet. If it doesn't, see if it has any power ratings printed on it, something like 300W for 300 watts or 100VA for 100 volt-amps (which is different from watts). These ratings say that the transformer can supply only up to that much power to the load. From that rating and a little math you can get the maximum current you can safely draw, which you may not even need to work out if the load you attach uses only a little current.
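The "little math" is just dividing the VA rating by the secondary voltage. Here is a minimal sketch of that calculation; the 100 VA rating and 12 V secondary used in the example are hypothetical numbers, not from your transformer:

```python
# Estimate the maximum load current a transformer secondary can supply,
# given its power rating in volt-amps and its secondary voltage.
def max_secondary_current(va_rating, secondary_voltage):
    """Return the maximum continuous current in amps: I = VA / V."""
    return va_rating / secondary_voltage

# Hypothetical example: a 100 VA transformer with a 12 V secondary
# can supply at most about 8.33 A to the load.
print(max_secondary_current(100, 12))
```

Your actual load only needs to draw less than that figure; it does not have to draw all of it.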
If this is all new to you, I recommend studying a little more before you proceed with your transformer project.