Date and time handling is a perennial problem in programming. For PHP programmers, there's a good library out there that performs all the difficult tasks and provides convenient APIs. Zend_Date has several constants defined, and it is good to know what each one of them represents. For this purpose, I have made a cheat sheet. Download the PDF and enjoy.
The table has two columns: the constant name and a sample value.
Title: Learning Nagios 3.0
Publisher: Packt Publishing
Author: Wojciech Kocjan
Nagios is a powerful and popular network monitoring application. I contacted Packt and asked for a review copy of the book.
Read the full table of contents on the publisher's site.
The topic being system administration, I could read the book from cover to cover in three days.
Like many other technical books, the introductory pages of Learning Nagios 3.0 sell you on Nagios. The first chapter convinces you why you should use Nagios to make your life easier by setting up automated monitoring of your servers.
Nagios is a popular network monitoring tool. Nagios lets you monitor your IT infrastructure, be it servers, routers, switches, or other devices. In this post, I will walk you through the steps of installing and configuring a basic Nagios setup on CentOS.
At the end of the tutorial you will have a working Nagios setup that monitors
- local disk space usage
- local system load
- an external website
Whenever any of the above runs into a problem or recovers from one, Nagios will alert you via email.
Unless you have a strong reason to compile Nagios yourself, you should use the binary packages available for your Linux distribution.
RPMForge provides packages that are not included in CentOS repositories. We will use the Nagios packages from RPMForge.
Enabling RPMForge repository
For 32-bit machines, use these commands:
wget http://packages.sw.be/rpmforge-release/rpmforge-release-0.5.1-1.el5.rf.i386.rpm
rpm -ivh rpmforge-release-0.5.1-1.el5.rf.i386.rpm
For 64-bit machines, use these commands:
wget http://packages.sw.be/rpmforge-release/rpmforge-release-0.5.1-1.el5.rf.x86_64.rpm
rpm -ivh rpmforge-release-0.5.1-1.el5.rf.x86_64.rpm
Installing And Configuring Nagios
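To give you an idea of where we are headed, the three checks promised above each end up as a Nagios service definition. Here is a hedged sketch of what those definitions typically look like; the template names (local-service, generic-service), thresholds, and command names follow the sample configuration shipped with standard Nagios packages, and may differ on your system:

```
# Disk space on the root partition of the Nagios host itself.
# Warn below 20% free, go critical below 10% free.
define service {
    use                 local-service
    host_name           localhost
    service_description Root Partition
    check_command       check_local_disk!20%!10%!/
}

# Load average on the Nagios host (warning and critical
# thresholds for 1, 5 and 15 minute averages).
define service {
    use                 local-service
    host_name           localhost
    service_description Current Load
    check_command       check_local_load!5.0,4.0,3.0!10.0,6.0,4.0
}

# HTTP check against an external website; the matching
# host definition for www.example.com goes elsewhere.
define service {
    use                 generic-service
    host_name           www.example.com
    service_description HTTP
    check_command       check_http
}
```

Email alerts come from the contact and notification settings attached to these services, which we will get to after the installation.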
I'm sharing the slides in this entry.
It was an introductory talk on XML for PHP developers. There are hundreds of technologies built on top of XML. We have all heard about RSS, Atom, XML-RPC, SOAP, etc. The goal of the talk was to get PHP developers to start using XML. In the talk, I presented three recipes:
- Parsing RSS feeds using simplexml. This was based on my blog post about retrieving weather information in your location using Yahoo! Weather.
- Generating an Atom feed document using DOM.
- Scraping web pages using DOM and XPath. This was based on Web Scraping With lxml; this time I did it in PHP.
A while ago, we discussed how to scrape information from websites that don't offer information in a structured format like XML or JSON. We noted that urllib and lxml are indispensable tools in web scraping. While urllib enables us to connect to websites and retrieve information, lxml helps convert HTML, broken or not, to valid XML and parse it. In this post, I will demonstrate how to retrieve information from web pages that require a login session.
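The key ingredient for a login session is cookie handling: the server sets a session cookie when you POST the login form, and every later request must send it back. With urllib this means building an opener around a cookie jar. Here is a minimal sketch; the login URL and form field names are placeholders you would replace after inspecting the site's actual login form:

```python
import http.cookiejar
import urllib.parse
import urllib.request

def make_login_request(login_url, form_fields):
    """Build an opener that keeps session cookies, plus the login POST.

    `login_url` and the keys in `form_fields` are hypothetical; look at
    the target site's login form to find the real URL and field names.
    """
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(jar))
    # urlencode the form fields; a Request carrying a data payload
    # is sent as a POST.
    data = urllib.parse.urlencode(form_fields).encode("utf-8")
    request = urllib.request.Request(login_url, data)
    return opener, request, jar
```

After `opener.open(request)` succeeds, the session cookie lives in the jar and every subsequent `opener.open(...)` call sends it automatically, so the protected pages you fetch can be handed to lxml for parsing as usual.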
Jamie asks on LinkedIn.
The short answer
The question is wrong.
The long answer
A true PHP developer is a programmer who knows PHP. A false PHP developer is someone who doesn't know PHP. That's the closest correct answer I can think of.
I think Jamie means to ask, "What's your definition of a good PHP developer?" There is no single correct answer to that question. All you can do is highlight some of the things a good PHP developer does.
Let's seize this opportunity to talk about the traits of a good PHP developer. Most of what applies to a good PHP programmer also applies to a good web developer, and to a good programmer in general.
The topic was Free Software Movement and GNU/Linux operating system.
It was a long drive to Reva Institute, 40 kilometers from home. I reached the venue in time, thanks to the moderate traffic. The third floor was already filled, so I had to go to the fourth floor to listen to the speech. The auditorium stage could be viewed from both the third and fourth floors, which were two elevated blocks, one above the other. There were no chairs on the fourth floor, and the floor was a bit dusty. Approximately five hundred people attended the event.
The talk went much as you would expect. RMS started off explaining the meaning of free software and the four freedoms. Then he talked about the history of the free software movement, FSF, GNU, Linux, and Emacs. Even though I am quite familiar with these topics, it was interesting to hear them straight from the horse's mouth.
More and more websites are offering APIs nowadays. Previously, we've talked about XML-RPC and REST. Even though web services are growing exponentially, there are a lot of websites out there that offer information only in unstructured formats, especially government websites. If you want to consume information from those websites, web scraping is your only choice.
What is web scraping?
Web scraping is a technique in which a program mimics a human browsing a website. To scrape a website from your programs, you need tools to
- Make HTTP requests to websites
- Parse the HTTP response and extract content
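This series uses urllib for the first step and lxml for the second. Just to illustrate what the extraction step amounts to, here is a stdlib-only sketch using Python's html.parser instead of lxml; it walks the markup and collects every link target:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every anchor tag seen."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed('<p>See <a href="/docs">docs</a> and <a href="/faq">FAQ</a>.</p>')
# parser.links now holds ["/docs", "/faq"]
```

With lxml the same extraction becomes a one-line XPath query once the document tree is built, and lxml also copes with broken HTML, which is why we prefer it for real scraping work.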
- base.mako contains the layout of the web page. Many templates inherit from base.mako. Here's a snippet from base.mako:
<%def name="head()">
<html>
<head>
<title>Some title</title>
<script>...</script>
<script>...</script>
</head>
</%def>
- my_page.mako inherits from base.mako. From within my_page.mako, we want to be able to append script tags to the head section of the web page.
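One common Mako pattern for this (the def and file names below are illustrative, not necessarily the exact code in question) is to have base.mako call an overridable def inside the head, empty by default, and let each child template redefine it with its own script tags:

```mako
## base.mako
<html>
<head>
  <title>Some title</title>
  ${self.head_tags()}
</head>
<body>
  ${self.body()}
</body>
</html>
<%def name="head_tags()"></%def>

## my_page.mako
<%inherit file="base.mako"/>
<%def name="head_tags()">
  <script src="/js/my_page.js"></script>
</%def>
Page content goes here.
```

Because Mako resolves `self.head_tags()` against the most-derived template, rendering my_page.mako emits its script tag inside the head that base.mako lays out, while templates that don't override the def get an empty hook.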