RE: 5 Things You … bla bla … PHP Performance

Hey guys, hey Gonzalo!

I’ve just read this very, very long article by Gonzalo Ayuso on DZone about PHP performance.

http://php.dzone.com/articles/5-things-you-should-check-now

Nice … really nice … but!

I think this article tells you very basic things that are not so important compared to others …
… so: I will tell you the things you should really check now … then what you should do and why … and you don’t have to read much :) Let’s go!

First

  • Yes, you have to measure some things!
  • No, you should not use microtime()! Use tools!

WHY: You need a history of many metrics (memory, CPU, network, disk I/O, requests, response time, 404 errors, …), so you need some tools: use Google WMT, Google Analytics, Webgrind, Cacti, Pingdom, Nagios, log files
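To see why single microtime() measurements are not enough: what you want is a history over many requests. A minimal sketch (the log format and file name are my assumptions, not from the article) that pulls response-time statistics out of an access log:

```php
<?php
// A sketch: compute average and 95th-percentile response times from an
// access log whose LAST column is the response time in milliseconds.
// (log format and column layout are assumptions)

function responseTimeStats($logFile)
{
    $times = array();
    foreach (file($logFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $cols = preg_split('/\s+/', trim($line));
        $times[] = (float) end($cols); // last column: response time in ms
    }
    sort($times);
    return array(
        'avg' => array_sum($times) / count($times),
        'p95' => $times[(int) ceil(0.95 * (count($times) - 1))],
    );
}
```

One slow request means nothing; a rising p95 over a week means something.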

Second

  • Make a PHP script with <?php phpinfo();
  • Open it and search for APC.
  • If you find nothing, run: “apt-get install php-apc”
  • Restart your webserver.

WHY: It’s so easy to set up and it just works.
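If you have no browser handy, you can also check from the command line. A quick sketch (note: on newer PHP installs the cache may be called APCu or Zend OPcache instead of APC):

```php
<?php
// Check from the CLI whether an opcode/user cache extension is loaded.
// "apc" is what the article installs; the other two names cover newer setups.
foreach (array('apc', 'apcu', 'Zend OPcache') as $ext) {
    printf("%-12s %s\n", $ext, extension_loaded($ext) ? 'loaded' : 'missing');
}
```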

Third

“Cache like there is no tomorrow”? <= Hmmh, maybe … or not!

  1. Your app should work without caching … and it should be fast without caching.
  2. Then think about a good, consistent caching layer … what can be cached? How can you do it without blowing up your code with lots of caching code? Don’t scatter caching code all over your application … find a central point: HTML rendering and database query caching are good points.
  3. Implement caching, then activate it only on the production server … not in the dev environment.

WHY: This way caching is not required for the health of your application; if you start programming with caching from the beginning, you usually don’t care so much about performance.

Fourth

Yes, background tasks and worker threads are great … but …
You must have logs! Log for every task: start, end, result (code) … catch all output and errors … write them to a log file which you can easily view. It’s also very helpful to have an interface for stopping and starting background tasks manually.

WHY: Concurrent code is one of the top challenges in programming; if you don’t know what’s happening … you are doomed! Think of a worker that does some stuff and sends emails to thousands of users … every night. In the morning your boss asks you if everything works fine, and you can say: “yes” … or “no, but I can restart it now at user no. 13456”.
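A logged, resumable mail worker could be sketched like this (sendMail() and the file layout are hypothetical; real code would also trap fatal errors and signals):

```php
<?php
// A sketch of a resumable, logged background task. sendMail() is a
// hypothetical helper; the state file remembers the next index to process,
// so after a crash you can restart exactly where it stopped.

function runMailTask(array $userIds, $stateFile, $logFile)
{
    $log = function ($msg) use ($logFile) {
        file_put_contents($logFile, date('c') . " $msg\n", FILE_APPEND);
    };

    // Resume after the last successfully processed user.
    $start = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;
    $log("task start (resuming at index $start)");

    for ($i = $start; $i < count($userIds); $i++) {
        try {
            sendMail($userIds[$i]);                      // hypothetical helper
            file_put_contents($stateFile, (string) ($i + 1));
        } catch (Exception $e) {
            $log("task failed at user {$userIds[$i]}: " . $e->getMessage());
            return; // restart later; the state file says where to continue
        }
    }
    $log('task end, result=0');
}
```

With this, “restart it now at user no. 13456” is literally one command away.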

Fifth

Database indexes … yes, Gonzalo is right: check them now, and don’t forget them when you create new tables.
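How to check? Ask the database for its query plan. A small sketch using SQLite via PDO (assuming the pdo_sqlite driver is installed; with MySQL you would run EXPLAIN instead):

```php
<?php
// A sketch: use EXPLAIN QUERY PLAN (SQLite) to see whether a WHERE clause
// actually hits an index. Table and index names are just for illustration.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)');
$db->exec('CREATE INDEX idx_users_email ON users (email)');

$plan = $db->query(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = 'a@b.c'"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($plan as $row) {
    echo $row['detail'], "\n"; // should mention "USING INDEX idx_users_email"
}
```

If the plan says “SCAN” instead of “SEARCH … USING INDEX”, you just found your missing index.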

Last

Death by traffic … a luxury problem that will probably never happen :)

After load testing you know what your servers can handle … but that won’t help you … what really helps is a strategy for what to do in such a moment. It would be good if you could easily scale your application (database, dynamic content requests, static file requests, …) … but that can be very complicated, so I would not do it just for this off chance.

More likely this will happen: you do your work well and some of your competitors start to monitor your site (new customers on the page, new content, …) … and usually they don’t care about your servers and do really bad things to your performance. You should have a way to detect and block such requests.
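Detecting such clients can start very simply: count requests per IP and cut off the aggressive ones. A minimal sketch (class name and in-memory storage are made up; a production version would use a shared store like APC or memcached and a sliding time window):

```php
<?php
// A sketch of per-IP request counting to spot aggressive monitoring bots.
// Storage is a plain array here, so it only lives for one process —
// a real throttle would share state across requests.

class RequestThrottle
{
    private $limit;
    private $hits = array();

    public function __construct($limit)
    {
        $this->limit = $limit; // allowed requests per window
    }

    // Count the request; return false once the client goes over the limit.
    public function allow($ip)
    {
        if (!isset($this->hits[$ip])) {
            $this->hits[$ip] = 0;
        }
        $this->hits[$ip]++;
        return $this->hits[$ip] <= $this->limit;
    }
}
```

Call allow() early in your front controller and answer 429 (or 403) when it returns false — your real users will never notice, the scraper will.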

Do it now! That’s my opinion.