Showing posts from March, 2015

Strict iptables Rules for a PostgreSQL Server (Configured for Streaming Replication)

An iptables rules script for a PostgreSQL server configured as a master or a standby for streaming replication.
# IP address of this server
SERVER_IP=$(/sbin/ifconfig -a | awk '/(cast)/ { print $2 }' | cut -d':' -f2 | head -1)

DNS_SERVER=<write IP address of the dns server>
SSH_CLIENT=<write the IP address from where you make ssh connections>
PGE_SERVER=<write IP address of the other postgresql server>

# Flush iptables rules
iptables -F
iptables -X

# Set default filter policy
iptables -P INPUT DROP
iptables -P OUTPUT DROP
iptables -P FORWARD DROP

# Allow traffic on loopback adapter
iptables -A INPUT -i lo -j ACCEPT
iptables -A OUTPUT -o lo -j ACCEPT

# Allow incoming ssh only
iptables -A INPUT -p tcp -s $SSH_CLIENT -d $SERVER_IP --sport 513:65535 --dport 22 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -p tcp -s $SERVER_IP -d $SSH_CLIENT --sport 22 --dport 513:65535 -m state --state ESTABLISHED -j ACCEPT

# Allow …
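The excerpt is cut off above; a full ruleset for this setup would presumably continue with DNS and the replication traffic itself. A hedged sketch, assuming PostgreSQL listens on its default port 5432 and reusing the variables defined earlier:

```shell
# Allow DNS lookups to the configured resolver (assumption: plain UDP port 53)
iptables -A OUTPUT -p udp -s $SERVER_IP -d $DNS_SERVER --dport 53 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A INPUT  -p udp -s $DNS_SERVER -d $SERVER_IP --sport 53 -m state --state ESTABLISHED -j ACCEPT

# Allow streaming replication to/from the other PostgreSQL server
# (assumption: default PostgreSQL port 5432, connections may originate from either side)
iptables -A INPUT  -p tcp -s $PGE_SERVER -d $SERVER_IP --dport 5432 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A OUTPUT -p tcp -s $SERVER_IP -d $PGE_SERVER --sport 5432 -m state --state ESTABLISHED -j ACCEPT
iptables -A OUTPUT -p tcp -s $SERVER_IP -d $PGE_SERVER --dport 5432 -m state --state NEW,ESTABLISHED -j ACCEPT
iptables -A INPUT  -p tcp -s $PGE_SERVER -d $SERVER_IP --sport 5432 -m state --state ESTABLISHED -j ACCEPT
```

These rules require root to apply; adjust the port if your postgresql.conf sets a non-default one.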

Preserving Linux Shell History Even If Working with Multiple Terminals

If you continuously run shell commands in more than one Linux terminal, you probably want every shell (mostly bash) prompt to remember commands from any terminal. By adding the following settings to your .bashrc file, you can do so.

# This is for ignoring and erasing duplicate entries
export HISTCONTROL=ignoredups:erasedups
# This is for a large in-memory history
export HISTSIZE=102400
# This is for a big history file
export HISTFILESIZE=100000
# This is for appending commands to the history file instead of overwriting it
shopt -s histappend
# This is for saving and reloading the history after each command is run
export PROMPT_COMMAND="history -a; history -c; history -r; $PROMPT_COMMAND"

Preserving links in Linux

Linux commands like tar and cp have options that control whether symbolic links are followed. When you run a tar command to back up directories that contain multiple links to big files, you can end up with unnecessary copies of the same data.
In the case of cp, when a symbolic link is encountered and the -L (dereference) option is used, the data in the file the link targets is copied. If you use the -d (no-dereference) option instead, cp copies the link itself.
Look at the following example:
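A minimal illustration of both behaviors (the file names big.dat and link.dat are hypothetical, created here in a temporary directory):

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"
# Create a regular file and a symbolic link pointing at it
echo "payload" > big.dat
ln -s big.dat link.dat

# tar stores the symlink itself by default; -h (--dereference) stores a full copy
tar -cf  links.tar  big.dat link.dat   # link.dat kept as a symlink
tar -chf copies.tar big.dat link.dat   # link.dat kept as a duplicate of big.dat

# cp -L copies the target's data; cp -d copies the link itself
cp -L link.dat deref.dat               # deref.dat is a regular file
cp -d link.dat nolink.dat              # nolink.dat is still a symlink
ls -l deref.dat nolink.dat
```

With many links to the same big file, the -h archive grows by one full copy per link, which is exactly the waste described above.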

Setting Up a Workgroup Directory in Linux

The following procedure may be useful to create workgroup folder for a team of people.
The workgroup name is HR, with members cbing, mgeller, and rgreen. The folder is /data/hr. Requirements:
- Only the creators of files in the /data/hr folder should be able to delete them.
- Members shouldn't have to worry about file ownership, and all members of the group need full access to files.
- Non-members should not have access to any of the files.
The following will match the requirements written above:
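A sketch of one common way to meet these requirements, using a setgid bit for group inheritance and a sticky bit for delete protection (the group and user names come from the post; the demonstration runs on a temporary directory so the permission bits can be shown without root):

```shell
#!/bin/sh
set -e
# On a real system (as root) you would create the group and membership first:
#   groupadd hr
#   usermod -aG hr cbing; usermod -aG hr mgeller; usermod -aG hr rgreen
DIR=$(mktemp -d)/hr
mkdir -p "$DIR"
# chgrp hr "$DIR"   # hand the directory to the hr group (requires the group to exist)
chmod 3770 "$DIR"   # 2xxx setgid: new files inherit the hr group, so ownership "just works"
                    # 1xxx sticky: only a file's owner (or root) may delete it
                    #  770: full access for owner and group, nothing for non-members
stat -c '%a %n' "$DIR"
```

On the real /data/hr the same chmod 3770 applies after the chgrp; the setgid bit is what frees members from thinking about ownership.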

Extracting an HTML Page Contents with Python's BeautifulSoup4

BeautifulSoup's get_text method can be used to strip HTML tags and get a page's contents. The script looks like:
# -*- coding: utf-8 -*-
import sys
import os
from bs4 import BeautifulSoup
import requests
# Re-exec with UTF-8 output encoding if stdout has none (e.g. when output is piped)
if sys.stdout.encoding is None:
    os.putenv("PYTHONIOENCODING", 'UTF-8')
    os.execv(sys.executable, ['python']+sys.argv)
url = sys.argv[1]
page_content = requests.get(url)
text = BeautifulSoup(page_content.text, "html.parser").get_text()
print text
This Python code can be run with a URL as its command line argument, like: # python