Here's a common question for newcomers to mobile game development, web game development, and similar systems:
How shall I design my application for scalability? And what do you mean by application, for that matter? Do you mean the whole iOS app/PHP/mySql system, or something else?
A typical web scalability stack looks like: a load balancer in front, some number of stateless application servers behind it, and one or more database servers at the back.
Now, that being said, how you structure the data in the database matters. If you do a JOIN between two tables, those tables need to live on the same database server. Thus, if you do a JOIN between the data of two separate players, all players need to live on a single database server for that to work. You want to spread the data such that it doesn't need JOINs most of the time.
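To illustrate (with a hypothetical schema, using SQLite in-memory databases as stand-ins for two shards): a JOIN within a single player's data stays on that player's shard, while a query across players on different shards can't be a SQL JOIN at all -- it becomes separate queries merged in application code.

```python
import sqlite3

# Two in-memory databases stand in for two shards (hypothetical schema).
shard1 = sqlite3.connect(":memory:")
shard2 = sqlite3.connect(":memory:")

for shard in (shard1, shard2):
    shard.execute("CREATE TABLE players (id INTEGER PRIMARY KEY, name TEXT)")
    shard.execute("CREATE TABLE inventory (player_id INTEGER, item TEXT)")

# Player 1000001 lives on shard 1; player 2000001 lives on shard 2.
shard1.execute("INSERT INTO players VALUES (1000001, 'alice')")
shard1.execute("INSERT INTO inventory VALUES (1000001, 'sword')")
shard2.execute("INSERT INTO players VALUES (2000001, 'bob')")
shard2.execute("INSERT INTO inventory VALUES (2000001, 'shield')")

# A JOIN within one player's data stays on that player's shard -- fine.
row = shard1.execute(
    "SELECT p.name, i.item FROM players p "
    "JOIN inventory i ON i.player_id = p.id WHERE p.id = ?",
    (1000001,),
).fetchone()
print(row)  # ('alice', 'sword')

# A query spanning players on different shards cannot be a SQL JOIN;
# it becomes one query per shard, merged in application code.
names = [
    shard.execute("SELECT name FROM players").fetchone()[0]
    for shard in (shard1, shard2)
]
print(names)  # ['alice', 'bob']
```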
For example, a player, their login history, their inventory, and stats, can all live on a particular server for that player. You then allocate players to database servers in some even fashion. For example, players with player id 1,000,000 - 1,999,999 go to server 1, players with player id 2,000,000 - 2,999,999 go to server 2, and so on. (Don't allocate new player IDs sequentially, but rather based on what database server is least loaded right now to put new players on the lowest-loaded server.) Each such database is called a "shard" and the concept is called "horizontal sharding."
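The routing and allocation described above can be sketched in a few lines. This is a minimal sketch, not a production implementation: `next_free_id_per_shard` and `load_per_shard` are hypothetical stand-ins for whatever bookkeeping your ops system actually keeps.

```python
SHARD_SIZE = 1_000_000  # IDs 1,000,000-1,999,999 -> shard 1; 2,000,000-2,999,999 -> shard 2; etc.

def shard_for_player(player_id: int) -> int:
    """Route an existing player ID to its shard by ID range."""
    return player_id // SHARD_SIZE

def allocate_player_id(next_free_id_per_shard: dict, load_per_shard: dict) -> int:
    """Give a new player the next free ID on the least-loaded shard
    (rather than allocating IDs sequentially)."""
    shard = min(load_per_shard, key=load_per_shard.get)
    new_id = next_free_id_per_shard[shard]
    next_free_id_per_shard[shard] = new_id + 1
    return new_id

# Usage: shard 2 is least loaded, so the new player lands in its ID range.
next_free = {1: 1_000_500, 2: 2_000_042}
load = {1: 0.80, 2: 0.35}
pid = allocate_player_id(next_free, load)
print(pid, shard_for_player(pid))  # 2000042 2
```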
Then, when you have operations that absolutely need transactions across tables, put only that data on a separate server. For example, you may have a trade system. To implement trades, objects need to be transactionally moved to the trade system (typically via a transfer queue on each player shard) where the trade is actually settled. Allocating IDs, auditing each step in order, and accounting for failure along the way are important to avoid item-duplication bugs or lost items. Similarly, high scores are typically put on a system of their own, where all player scores can be sorted. Because that system doesn't deal with other player data (login/password/inventory/etc.) it will scale further before it runs out of capacity.
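The transfer-queue idea can be sketched like this (in-memory dicts stand in for the player shard and the trade service; all names are hypothetical). The key properties are that step 1 is a single local transaction on the player's shard, and step 2 is keyed by a unique transfer ID so a crash-and-retry neither duplicates nor loses the item.

```python
import uuid

# Hypothetical in-memory stand-ins for one player shard and the trade service.
player_inventory = {"sword": "alice"}   # item -> owner, on alice's shard
transfer_queue = []                     # "outbox" table on alice's shard
trade_service_escrow = {}               # items now held by the trade system

def enqueue_transfer(item: str) -> str:
    """Step 1, one local transaction on the player's shard: remove the item
    from the inventory and record it in the outbox with a unique transfer ID,
    so a retry can be recognized later."""
    transfer_id = str(uuid.uuid4())
    del player_inventory[item]
    transfer_queue.append({"id": transfer_id, "item": item})
    return transfer_id

def settle_transfers() -> None:
    """Step 2, run by the trade service: take ownership of each queued item
    exactly once, keyed by transfer ID, then clear the queue. Re-running
    after a crash is safe: an ID already in escrow is skipped, not duplicated."""
    for transfer in list(transfer_queue):
        if transfer["id"] not in trade_service_escrow:
            trade_service_escrow[transfer["id"]] = transfer["item"]
        transfer_queue.remove(transfer)

tid = enqueue_transfer("sword")
settle_transfers()
settle_transfers()  # idempotent: a retry changes nothing
print(trade_service_escrow[tid])  # 'sword'
print(player_inventory)           # {} -- no duplicate copy left behind
```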
However: I really think you'll do fine on Amazon AWS with load balancing, some number of application servers (start with 2, so if one dies, there's another one still running), and Amazon RDS for MySQL. You can go very, very far on that setup, unless your game is crazy and writes the player state to the database every second or some such.
Just make sure you don't store persistent data on the local disk of an Amazon EC2 instance as those are ephemeral, and will go away at times.
Separately, real-time simulations need another structure (crossbar, shared bus, or similar), and player-to-player chat also needs another structure: they aren't stateless, so they don't fit the "stateless app server" model. If your server needs simulation, you'll need to take another tack for that. But it doesn't sound like that's what you're doing.
The other good news is that you can start with a single server instance, running both app server and database, without load balancing. Store the database back-end on Elastic Block Store (so it doesn't go away when the instance dies), and if your game takes off, you can move the data and code to a larger number of servers.