Say you have a pretty typical Windows server that runs Microsoft IIS (web server), MS SQL (database engine) and probably some basic stuff like SMTP, SSH, etc.
Here are some I/O performance hacks I came up with while trying to optimize our Amazon EC2 instances on Windows.
You can get a significant performance boost by moving some of your stuff from EBS to "instance-store" (aka "ephemeral storage"). You can move the "temp" folders, TempDB database, page file, IIS logs, mail pickup folders, transaction logs etc. And not just on Windows - on any platform.
Why would you do that? Because EBS drives are slow as fuck and are EC2's biggest performance bottleneck. To put it simply, EBS is just a network drive that lives on different hardware than your server. Amazon keeps coming up with performance optimizations every now and then (like the "Provisioned IOPS" model, "EBS-optimized" network adapters, etc.) but EBS will never be as fast as a local SSD drive that literally lives inside your server, inches away from your CPU. Which is exactly what "instance store" is - a local SSD drive.
The only little downside of using the ephemeral drive is... losing all your data forever when your EC2 instance is restarted (as in "stopped and started again", not as in "rebooted Windows"), simply because after a restart your server runs on a different physical machine across the hall. Obviously this means we can only move temporary data - data we don't need to persist - to an ephemeral disk.
SOME NOTES BEFORE WE START: in all the examples below I assume your instance-store volume is mounted as drive "Z:" (adjust the letter to your setup). Also, since the ephemeral drive comes up empty after a stop/start, most tips below come with a "startup script" note that re-creates the needed folders and permissions on boot.
This one is obvious. Adjust your environment variables (both user-specific and system-wide) to point to a folder on the instance-store drive.
Create the Z:\TEMP folder. Then open "Control Panel" - "System and Security" - "Advanced system settings" - "Advanced" - "Environment variables" and set the TEMP and TMP variables in both the "system variables" and "user variables" sections. Make sure you change the user variables for all user accounts on your server, including service accounts like "MSSQLSERVER", "SQLSERVERAGENT", etc. (because - yes - these accounts write stuff to temp folders too). To do that, open regedit, go to HKEY_USERS\%userid%\Environment for every user, and change the location.
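If you'd rather script this than click through every account, here's a minimal batch sketch (the <SID> placeholder is an assumption - substitute each account's actual SID from HKEY_USERS):
rem machine-wide TEMP/TMP (run from an elevated prompt)
setx TEMP Z:\TEMP /M
setx TMP Z:\TEMP /M
rem per-user variables live under HKEY_USERS\<SID>\Environment -
rem repeat for every account, including the service accounts
reg add "HKU\<SID>\Environment" /v TEMP /t REG_EXPAND_SZ /d "Z:\TEMP" /f
reg add "HKU\<SID>\Environment" /v TMP /t REG_EXPAND_SZ /d "Z:\TEMP" /f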
Startup script: the temp folder should be re-created every time you start the server, so make sure you have some .bat file or a startup job in Task Scheduler that creates the folder and grants permissions to, say, "Everyone":
mkdir Z:\TEMP\
icacls Z:\TEMP /grant "Everyone":(OI)(CI)F
The above command creates the folder and grants "Everyone" full-control permissions.
Other notes: no need to create a unique folder for every user, since every time a Windows app wants to throw some stuff into a temp folder it calls the "GetTempFileName" Win32 API function, which generates a unique file name.
MS SQL Server always comes with a "TempDB" system database that is used for temporary tables, storing intermediate results during sorting, and similar disposable stuff. Moving this database can really improve your SQL performance. The database can safely be dropped during a reboot; SQL Server re-creates it at startup anyway.
Use this code to move the database files (after granting the required disk permissions, see the "startup script" note below):
USE master;
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'Z:\tempdb.mdf');
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'Z:\templog.ldf');
GO
-- optional for SQL Server 2016 (it creates several tempdb data files by default)
ALTER DATABASE tempdb MODIFY FILE (NAME = temp2, FILENAME = 'Z:\tempdb_mssql_2.ndf');
GO
Where "Z:" is your instance-store drive-letter.
Restart the SQL server and verify the files are moved by executing this script:
SELECT name AS [LogicalName], physical_name AS [Location], state_desc AS [Status]
FROM sys.master_files
WHERE database_id = DB_ID(N'tempdb');
GO
Startup script: your Z: drive needs write permissions for the "MSSQLSERVER" user account (or whichever account is used to run the SQL service). Those permissions are lost after a restart, so we need to run this during server startup:
icacls Z:\ /grant "NT SERVICE\MSSQLSERVER":(OI)(CI)F
The above command grants the MSSQLSERVER account full-control permissions on the whole Z: drive.
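Not sure which account the SQL service actually runs under? This command prints the service configuration, including the SERVICE_START_NAME account:
sc qc MSSQLSERVER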
Other notes: SQL will re-create the database automatically, just make sure the startup script runs BEFORE the SQL service is started.
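One way to get the ordering right (just a sketch - the task name and script path are made up): register the startup script as an "on start" scheduled task and switch the SQL services to delayed start, so the drive is prepared before they come up:
schtasks /create /tn "PrepEphemeralDrive" /tr "C:\scripts\prep-ephemeral.bat" /sc onstart /ru SYSTEM
sc config MSSQLSERVER start= delayed-auto
sc config SQLSERVERAGENT start= delayed-auto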
Let's move the page file to our ephemeral drive as well. On servers with relatively little memory (less than 8GB) this can drastically improve overall performance. Open "Control Panel" - "System and Security" - "Advanced system settings" - "Advanced" - "Performance" - "Virtual memory" - "Change" and select "no paging file" for all drives except the instance-store one.
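If you prefer to script this instead of clicking through dialogs, here's a hedged sketch using wmic (the 4096 MB size is just an example, and note that wmic is deprecated on the newest Windows builds):
wmic computersystem set AutomaticManagedPagefile=False
wmic pagefileset where name="C:\\pagefile.sys" delete
wmic pagefileset create name="Z:\pagefile.sys"
wmic pagefileset where name="Z:\\pagefile.sys" set InitialSize=4096,MaximumSize=4096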
By default, ASP.NET uses the "Temporary ASP.NET Files" folder to store temporary assemblies after compiling your .aspx/.cshtml files. This folder is also used to temporarily store the files your users upload to your app (if any). The default location is probably something like C:\Windows\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files. This folder is heavily written to when you, for example, update your application files (and the app gets unresponsive for 10-50 seconds).
Anyway, let's move it to disk Z: as well. Edit the "web.config" for your web application(s):
<compilation debug="false" tempDirectory="Z:\TempAspNetFolder"/>
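For context, here's where that element sits in a minimal web.config (everything except the tempDirectory attribute is stock):
<configuration>
  <system.web>
    <compilation debug="false" tempDirectory="Z:\TempAspNetFolder" />
  </system.web>
</configuration>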
Startup script: we need to pre-create the folder and grant permissions to the IIS worker-process account (e.g. the "IIS_IUSRS" group; I use "Everyone" here, but that's not the best practice):
mkdir z:\TempAspNetFolder
icacls Z:\TempAspNetFolder /grant "Everyone":(OI)(CI)F
Other notes: none.
What else can you move? I have a couple of staging databases, a test database for our helpdesk app's "demo" (who cares if it's recreated from scratch from time to time), etc. Just ask yourself: which data is safe to lose AND/OR can be recreated automatically via a script? I bet you have plenty.
The following steps assume that you really know what you're doing, that you have administrative skills, and that you realize you need to take steps to back this data up, yada-yada-yada, blah-blah, be careful.
If you're running an SMTP service on your Windows server, it might be a good idea to move the "mailroot" folder. For example, our helpdesk app sends THOUSANDS of emails every minute and the service is busy as hell, so moving these directories to the SSD drive helped a lot. Here's how you do it:
cd c:\inetpub\adminscripts\
adsutil.vbs set smtpsvc/1/dropdirectory z:\mailroot\drop
adsutil.vbs set smtpsvc/1/badmaildirectory z:\mailroot\badmail
adsutil.vbs set smtpsvc/1/pickupdirectory z:\mailroot\pickup
adsutil.vbs set smtpsvc/1/queuedirectory z:\mailroot\queue
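To double-check the new paths took effect, adsutil can read the settings back, e.g.:
adsutil.vbs get smtpsvc/1/dropdirectory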
Startup script: we need to create the "mailroot" folder every time we're up after a crash or something.
mkdir z:\mailroot
mkdir z:\mailroot\badmail
mkdir z:\mailroot\drop
mkdir z:\mailroot\pickup
mkdir z:\mailroot\queue
Other notes: the startup script does not assign any special permissions since SMTP typically runs under the "local system" account.
Every time someone opens a connection to your HTTP server, a log entry is written. This can have a HUGE impact on performance if you have many clients in your SaaS app and/or many website visitors. So let's move the IIS logs too - if you're fine with losing them after a server crash.
To set the IIS log location, open the "IIS management console" and check the "Logging" item under all your websites, virtual directories, and the root server node too. By the way, I strongly recommend setting up Advanced IIS Logging instead of the built-in one.
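If you'd rather script the server-wide default than click through the console, here's a sketch using appcmd (per-site settings, if any, would override this default):
%windir%\system32\inetsrv\appcmd set config -section:system.applicationHost/sites -siteDefaults.logFile.directory:"Z:\iislogs"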
Startup script: we need to pre-create the log folder and grant permissions:
mkdir z:\iislogs
icacls Z:\iislogs /grant "Everyone":(OI)(CI)F
Other notes: none.
You can move the SQL Server error logs to the ephemeral drive too. Open SQL Server Configuration Manager, right-click the MSSQLSERVER service, select "Startup Parameters" and point the error-log path (the "-e" parameter) to a folder on the ephemeral drive. Same for SQL Server Agent.
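For reference, the full startup-parameters string looks something like this (the paths are made-up examples, and your startup script would need to pre-create Z:\sqllogs too):
-dC:\Program Files\Microsoft SQL Server\MSSQL\DATA\master.mdf;-eZ:\sqllogs\ERRORLOG;-lC:\Program Files\Microsoft SQL Server\MSSQL\DATA\mastlog.ldf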
Danger zone. Don't do this unless you're a skilled DBA.
Database transaction logs eat up lots and lots of disk I/O operations. Actually, one of the best pieces of advice ever given on boosting SQL performance is "keep the data and the logs on different drives" (this applies to all platforms and DB engines, not just MS). But beware: by moving the logs to the ephemeral drive you might lose the ability to restore the database to a certain point in time, unless you come up with a solid backup strategy (that's a topic for a whole other blog post). Here's how you move the log:
USE master
GO
-- Set the database to single-user mode
ALTER DATABASE AdventureWorks SET SINGLE_USER
GO
-- Detach the database
EXEC sp_detach_db 'AdventureWorks'
GO
-- Now the database is detached.
-- GO MOVE THE LOG FILE, THEN SWITCH BACK TO THIS WINDOW.
-- Now attach the database:
EXEC sp_attach_db 'AdventureWorks',
'C:\PATH_TO_OLD_DATA_FILE\AdventureWorks_Data.mdf', -- old location of the data (not logs)
'Z:\AdventureWorks_Log.ldf' -- new location of the log
GO
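A hedged alternative that avoids the detach/attach dance (the logical file name "AdventureWorks_Log" is an assumption - look yours up in sys.master_files first):
ALTER DATABASE AdventureWorks MODIFY FILE (NAME = AdventureWorks_Log, FILENAME = 'Z:\AdventureWorks_Log.ldf');
GO
ALTER DATABASE AdventureWorks SET OFFLINE;
GO
-- move the physical .ldf file to Z:\ now, then bring the database back online:
ALTER DATABASE AdventureWorks SET ONLINE;
GO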
Startup script: no startup script needed since we already granted required permissions in Part 1.
Other notes: If the server crashes and the transaction log is lost - there are still ways to rebuild it:
-- run this if the log file is lost
ALTER DATABASE AdventureWorks SET EMERGENCY;
ALTER DATABASE AdventureWorks SET SINGLE_USER;
DBCC CHECKDB (AdventureWorks, REPAIR_ALLOW_DATA_LOSS) WITH NO_INFOMSGS, ALL_ERRORMSGS;
The above script should NOT be a part of your regular backup strategy and "disaster recovery". It's your last resort. Data consistency is not guaranteed after executing it.
Sometimes when you screw things up, your Windows EC2 machine crashes and freezes big time, and the only thing that will bring it back to life is a "force stop" via the AWS console or API. After it comes back up you might notice that the ephemeral storage is... gone! The whole "disk Z:" is missing from your system. Don't worry, it's there, it just hasn't been "attached" to your operating system. The SQL server won't start (because there's no Z: drive), the IIS and SMTP services won't start (because there's no Z: drive), and Windows will throw lots of scary error messages (because it was unable to create the page file... because there's no Z: drive!). Calm down. Go to "Computer Management" - "Disk Management", attach the instance store, format it, and assign the drive letter. Then run your startup script to create all the folders and files and assign the right permissions. Reboot Windows. Voila.
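To avoid clicking through Disk Management every time, here's a diskpart sketch that does the same thing (the disk number "1" is an assumption - verify it with "list disk" first, because "clean" wipes whatever disk is selected):
rem save as prep-disk.txt, then run:  diskpart /s prep-disk.txt
select disk 1
clean
create partition primary
format fs=ntfs quick label=Ephemeral
assign letter=Z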
But why not just use Amazon RDS instead, you ask? Who says "not"? RDS is a great service. Actually, RDS instances are just compute instances with EBS drives that are managed by Amazon (software upgrades, patches, backups, performance tuning, etc.) and priced slightly higher than regular EC2s.
RDS instances don't have instance storage - all the data lives on EBS. But if you have enough money to buy a "fast enough" RDS, i.e. pay for a setup that delivers I/O performance similar to these SSD hacks - why not! Actually, if you're a funded startup with $10000/month to spend, you can skip all these steps completely and just throw money at your AWS setup - pay for super-fast Provisioned IOPS EBS drives, switch to high-memory instances, etc.