There is a change of 1 mile for every 2 feet per second (fps) change in velocity when you are below a 500 mile altitude above Earth. A shuttle is at 220 miles above the Earth at its apogee and 210 miles above the Earth at its perigee. The shuttle needs to drop its perigee to 60 miles above the Earth. What is the change in velocity the shuttle needs to make in order to get to the 60 mile perigee altitude?
The actual apogee and perigee velocities of the 220/210 mile altitude orbit are 25,230 fps and 25,291 fps respectively.
The actual apogee and perigee velocities of a 220/60 mile altitude orbit are 24,998 fps and 25,992 fps respectively.
Therefore, the shuttle must reduce its apogee velocity by 25,230 − 24,998 = 232 fps in order to drop the perigee altitude to 60 miles, where the perigee velocity becomes 25,992 fps.
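For anyone who wants to check those numbers, here is a minimal vis-viva sketch in Python. The gravitational parameter and the 3,959 mile mean Earth radius are my own assumed constants, so the outputs land close to, but not exactly on, the figures above (the offset depends on which Earth radius you use):

```python
import math

# Assumed constants (not from the original post):
MU = 1.40765e16          # Earth's GM in ft^3/s^2 (~3.986e14 m^3/s^2 converted)
R_EARTH_MI = 3959.0      # mean Earth radius in miles (assumed)
FT_PER_MI = 5280.0

def apsis_velocities(apogee_alt_mi, perigee_alt_mi):
    """Return (apogee fps, perigee fps) for an orbit given apsis altitudes in miles."""
    r_a = (R_EARTH_MI + apogee_alt_mi) * FT_PER_MI   # apogee radius, ft
    r_p = (R_EARTH_MI + perigee_alt_mi) * FT_PER_MI  # perigee radius, ft
    a = (r_a + r_p) / 2.0                            # semi-major axis, ft
    v_a = math.sqrt(MU * (2.0 / r_a - 1.0 / a))      # vis-viva at apogee
    v_p = math.sqrt(MU * (2.0 / r_p - 1.0 / a))      # vis-viva at perigee
    return v_a, v_p

v_a1, v_p1 = apsis_velocities(220, 210)   # initial 220/210 mile orbit
v_a2, v_p2 = apsis_velocities(220, 60)    # target 220/60 mile orbit
print(f"220/210 orbit: apogee {v_a1:,.0f} fps, perigee {v_p1:,.0f} fps")
print(f"220/60  orbit: apogee {v_a2:,.0f} fps, perigee {v_p2:,.0f} fps")
print(f"burn at apogee: {v_a1 - v_a2:,.0f} fps retrograde")
```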
Your statement "There is a change of 1 mile for every 2 feet per second (fps) change in velocity..." is somewhat vague. Are you implying that
1) there is a change of 1 mile in perigee altitude for every 2 fps change in velocity (increase or decrease) at perigee, or
2) there is a change of 1 mile in perigee altitude for every 2 fps change in velocity (increase or decrease) at apogee?
Assuming that you meant the perigee altitude would drop 1 mile for every 2 fps reduction in apogee velocity, the apogee burn would require a delta-v decrease of (210 − 60) × 2 = 300 fps.
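A one-line sketch of that reading of the rule of thumb (assuming the 2 fps-per-mile figure applies to a burn at apogee):

```python
# Rule-of-thumb reading: 2 fps of apogee-burn delta-v per mile of perigee change.
perigee_drop_mi = 210 - 60     # miles of perigee altitude to shed
delta_v = perigee_drop_mi * 2  # fps, per the stated rule of thumb
print(delta_v)                 # 300 fps (vs. the 232 fps from the vis-viva figures above)
```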
If your statement "There is a change of 1 mile for every 2 feet per second (fps) change in velocity..." meant something else, please clarify.