Digbit – automatic generation of bit-flipped domains for bitsquatting [python]

Bitsquatting is not a brand-new technique, but it is relatively recent. This type of attack relies on a random bit flipping from 0 to 1 (or from 1 to 0) inside RAM. Because of this, even though you try to access a domain like cnn.com, your computer may in fact request ann.com. It's rare, but it happens, and it can be caused by cosmic radiation or overheated memory modules. If you would like to learn more, I can recommend Artem's page.
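
To see the mechanics, here is a minimal Python illustration of a single bit flip in ASCII (reproducing the cnn.com/ann.com example):

# A single flipped bit turns 'c' (0b1100011) into 'a' (0b1100001).
c = ord('c')        # 99 = 0b1100011
a = c ^ 0b0000010   # flip bit 1: 97 = 0b1100001
print(chr(a))       # prints 'a', so cnn.com becomes ann.com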

To make it easier to find domains that are a single bit flip away (a Hamming distance of 1), I've created a script that generates all the possibilities.

For example, let's search for domains (bit-wise) close to cnn.com. The script output will be:

snn.com knn.com gnn.com ann.com bnn.com c.n.com cfn.com cjn.com cln.com con.com cnf.com cnj.com cnl.com cno.com cnn.som cnn.kom cnn.gom cnn.aom cnn.bom cnn.cgm cnn.ckm cnn.cmm cnn.cnm cnn.coe cnn.coi cnn.coo cnn.col

To make it easier to check whether a particular domain is already registered, I've made a wrapper script that executes the Python script and, for each generated domain, runs the command:

> nslookup domain | grep "NXDOMAIN"

The wrapper script is executed with a single argument: the domain name. Sample output for twitter.com:

[Image: bitsquatting results for twitter.com]

Some of the domains reported as available are obvious false positives, since TLDs like kom don't really exist. I did not remove them because new TLDs are added from time to time, and you might as well have a custom domain set up within your LAN.
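
If you want to filter such results, a quick post-processing pass against a TLD list does the job. A minimal sketch (the TLD set here is only an illustrative subset, not the real list):

# Illustrative subset only; a real check would use the full IANA TLD list.
VALID_TLDS = {"com", "net", "org"}

candidates = ["snn.com", "cnn.kom", "cnn.coe", "ann.com"]
print([d for d in candidates if d.rsplit(".", 1)[-1] in VALID_TLDS])
# ['snn.com', 'ann.com']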

The wrapper code:

#!/bin/bash

for i in $( ./digbit.py "$1" );
do
        dotcheck=$( echo "$i" | grep -c "\." )
        echo -n "$i ";
        check=$( nslookup "$i" | grep -c "NXDOMAIN" );
        if [[ $check -ne 0 && $dotcheck -ne 0 ]];
        then
                echo " < available";
        else
                echo " ";
        fi;
done

The Digbit script code:

#!/usr/bin/env python3
import re
import sys

def new_text2bin(mtext):
        # ASCII text -> string of bits, 8 bits per character
        result = [ bin(ord(ch))[2:].zfill(8) for ch in mtext ]
        return ''.join(result)

def switch_bit(bit, my_list):
        # return a copy of the bit list with a single bit flipped
        result_list = my_list[:]
        result_list[bit] = '0' if result_list[bit] == '1' else '1'
        return result_list

def binstr_to_ascii(binstr):
        # pad to a multiple of 8 bits, then decode 8 bits at a time
        binstr = -len(binstr) % 8 * '0' + binstr
        string_blocks = (binstr[i:i+8] for i in range(0, len(binstr), 8))
        return ''.join(chr(int(block, 2)) for block in string_blocks)

if len(sys.argv) != 2:
        sys.exit("Usage: digbit.py <domain>")
else:
        domain = str(sys.argv[1])
        domain_list = list(new_text2bin(domain))

        for i in range(len(domain_list)):
                new_d = ''.join(switch_bit(i, domain_list))
                new_d_str = binstr_to_ascii(new_d)

                # keep only results that still look like a valid domain name
                correct_domain = re.match(r"^(((([a-z0-9]+){1,63}\.)|(([a-z0-9]+(\-)+[a-z0-9]+){1,63}\.))+){1,255}$", new_d_str + ".")

                if correct_domain is not None and len(correct_domain.string) > 1 and correct_domain.string.index(".") < len(correct_domain.string) - 1:
                        print(new_d_str)

        print(" ")


Parsing the authentication log with Python

This simple script is just an exercise. I'm learning Python and, frankly, I just like parsing text files.

The code below parses the /var/log/auth.log file and searches for failed authentication attempts. For each failed attempt it records the IP address, the date, the account used to authenticate and the remote port used. It then resolves each IP to a hostname and builds a list of the distinct accounts and ports used by each IP address. This list is displayed at the end of the script's execution. Instead of showing every port used, it shows just the range from the lowest to the highest. By default only the first five accounts are displayed in the table (unless the list of those five is longer than 30 characters – in that case the list is truncated). If you want to display all recorded accounts, you can replace this assignment in the output section, from this one:

parsed_accounts     = adjust_item( five_accounts,         30 )

to this one:

parsed_accounts     = item["accounts"]

The columns NOFA and NOFP show the number of accounts and the number of ports used, respectively. The date shown can be read as 'last seen' for the particular IP address.
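
For reference, this is the kind of line the script works on and the fields it pulls out. The sample line is illustrative (the exact format can vary between distributions), and the extraction is similar to the script's helper functions:

# Illustrative sshd log line and the same split-based field extraction.
line = ("Jun  2 08:47:37 myhost sshd[1234]: Failed password for invalid user "
        "admin from 61.174.51.35 port 2804 ssh2")

user = line.split("invalid user ")[1].split(" ")[0]   # 'admin'
port = line.split(" port ")[1].split(" ")[0]          # '2804'
ip   = line.split(" from ")[1].split(" ")[0]          # '61.174.51.35'
print(user, port, ip)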

The example output:

[Image: example output of the auth.log parser]

The script:

#!/usr/bin/env python3.4

# IMPORTS
import socket

# VARS
log_path = '/var/log/auth.log'
hosts=[]
full_hosts_data=[]
previous_ip = ""
previous_host = ""

# PAD A STRING TO A FIXED LENGTH WITH SPACES
def adjust_item( text, i ):
	return text.ljust(i)

# AS THE NAME SAYS
def get_hostname( ip ):
	global previous_ip
	global previous_host
	if previous_ip == ip:
		return previous_host
	else:
		try:
			new_host = socket.gethostbyaddr(ip)
			previous_ip = ip
			previous_host = new_host[0]
			return new_host[0]
		except Exception:
			new_host = ip
			previous_ip = ip
			previous_host = ip
			return new_host

# RETURNING FIRST FIVE ACCOUNTS AND NUMBER OF ALL ACCOUNTS TRIED
def first_5( parsed_string ):
	result_5 = ""
	count_all = 0
	if len( parsed_string.split("|") ) > 5 :
		index = 5
		for item in parsed_string.split("|"):
			if index > 0 and len(item) > 0:
				result_5 = result_5 + "|" + item
				index = index - 1
			if len(item) > 0:
				count_all = count_all + 1
	else:
		for item in parsed_string.split("|"):
			if len(item) > 0:
				result_5 = result_5 + "|" + item
				count_all = count_all + 1
	return (result_5, count_all )

# CHECKING PORT RANGE AND NUMBER OF PORTS WITH FAILED PASSWORDS
def port_parser( parsed_string):
	smallest = 66000
	largest = -1
	counter = 0
	for port in parsed_string.split("|"):
		if len(port) > 0:
			if int(port) < smallest:
				smallest = int(port)
			if int(port) > largest:
				largest = int(port)
			counter = counter + 1
	return( largest, smallest, counter )

def get_date( my_line ):
	date_words = my_line.split(":")
	date = date_words[0] +":"+ date_words[1] +":"+ ((date_words[2]).split(" "))[0]
	return date

def get_ports( my_line ):
	port_words = my_line.split(" port ")
	port = (port_words[1]).split(" ")
	return port[0]

def get_username( my_line ):
	username_words = my_line.split("invalid user ")
	username = (username_words[1]).split(" ")
	return username[0]

def get_username2( my_line ):
	username_words = my_line.split("Failed password for ")
	username = (username_words[1]).split(" ")
	return username[0]

def check_distinct(itemlist, my_item):
	item_exists = 0
	my_list = itemlist
	for i in my_list.split("|"):
		if i == my_item:
			item_exists = 1
	if item_exists == 0:
		my_list = my_list + "|" + my_item
	return my_list

# READ FILE
with open(log_path, 'rt') as log:
	text = log.read();

# COLLECTING HOSTS AND IPS
for line in text.split("\n"):
	if len(line) > 5:
		# PARSE LINE AND ADJUST FIELD LENGTH
		check_1 = line.find("cron:session")
		check_2 = line.find("Disconnecting")
		check_3 = line.find("Address")
		if check_1 == -1 and check_2 == -1 and check_3 == -1:
			break_in = line.find("POSSIBLE BREAK-IN ATTEMPT")
			if break_in != -1:
				words = line.split(" [")
				words2 = (words[1]).split("]")
				host = get_hostname( words2[0] )
				exists_check = 0
				for my_host in hosts:
					if my_host["ip"] == words2[0]:
						exists_check = 1
				if exists_check == 0:
					hosts.append({"ip":words2[0], "hostname":host})

for my_host in hosts:
	ports = ""
	accounts = ""
	date = ""

	for line in text.split("\n"):
		# CHECK LINES FOR FAILED PASS ATTEMPTS
		if line.find(my_host["ip"]) != -1 and line.find("Failed password") != -1:

			if line.find("Failed password for invalid ") != -1:
				username = get_username( line ) 				# GET USERNAME
			else:
				username = get_username2( line ) 				# GET USERNAME

			port = get_ports( line ) 							# GET PORT USED
			date = get_date( line ) 							# GET DATE
			ports = check_distinct(ports, port) 				# SAVE ONLY DISTINCT PORTS
			accounts = check_distinct(accounts, username )		# SAVE ONLY DISTINCT ACCOUNTS

	# SAVE ACTUAL ATTEMPTS
	if len(ports) > 1:
		full_hosts_data.append({
			"ip":my_host["ip"],
			"hostname":my_host["hostname"],
			"accounts":accounts,
			"ports":ports,
			"date":date
		});

# PRINT TABLE HEADERS
print(
	adjust_item("DATE", 16 ),
	adjust_item("IP", 15),
	adjust_item("HOSTNAME", 40),
	adjust_item("ACCOUNTS", 30) + adjust_item("NOFA ", 4),
	adjust_item("PORT RANGE", 12),
	adjust_item("NOFP",5)
)

# GENERATING OUTPUT
# DATE             IP              HOSTNAME                                 ACCOUNTS                      NOFA  PORT RANGE   NOFP
# Jun  2 08:47:37  61.174.51.XXX   XXX.51.174.61.dial.XXX.dynamic.163data   root|admin                    2     2804 ->58246 30

for item in full_hosts_data:

	largest_port, smallest_port, port_count = port_parser(item["ports"])
	five_accounts, account_counter = first_5(item["accounts"])

	parsed_ip 			= adjust_item( item["ip"], 			15 )
	parsed_host 		= adjust_item( item["hostname"] , 	40 )
	parsed_accounts 	= adjust_item( five_accounts, 		30 )
	parsed_acounter 	= adjust_item( str(account_counter), 5 )
	parsed_portrange 	= adjust_item(str(smallest_port), 	 5 ) + "->" + adjust_item(str(largest_port) ,5 )
	parsed_port_count	= adjust_item( str(port_count), 	 5 )
	parsed_date 		= adjust_item( item["date"], 		16 )

	print(
		parsed_date[:16],
		parsed_ip, parsed_host[:40],
		parsed_accounts[1:30],
		parsed_acounter,
		parsed_portrange,
		parsed_port_count
	)

The code above is provided as is. I do not guarantee it will work in your environment; I've tested it on Debian Jessie (testing). Please use it at your own risk.

[Another] Apache access log parser in Python

I must admit – Python is mighty cool. I started learning it yesterday, and today I've managed to create a simple parser for the Apache2 access log – and with colors!
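
The whole parser hinges on a single regular expression over Apache's combined log format. Here is a minimal sketch of what it captures (the sample line is made up):

import re

# One line in Apache's "combined" log format (illustrative).
sample = ('127.0.0.1 - - [10/Oct/2014:13:55:36 +0200] "GET /index.html HTTP/1.1" '
          '200 2326 "http://example.com/" "Mozilla/5.0"')

regex = r'([(\d\.)]+) - - \[(.*?)\] "(.*?)" (\d+) (\d+) "(.*?)" "(.*?)"'
m = re.match(regex, sample)
print(m.groups())  # ip, timestamp, request, status, size, referer, user agent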

Nothing fancy so i’m just going to drop it here:

 

#!/usr/bin/env python3.4

# IMPORTS
import re
import socket
from colorama import init, Fore, Back, Style

# VARS
regex = r'([(\d\.)]+) - - \[(.*?)\] "(.*?)" (\d+) (\d+) "(.*?)" "(.*?)"'
log_path = '/var/log/apache2/access.log'
previous_ip = " "
previous_host = " "

# FUNCTIONS
# PAD A STRING TO A FIXED LENGTH WITH SPACES
def adjust_item( text, i ):
	return text.ljust(i)

def get_hostname( ip ):
	global previous_ip
	global previous_host
	if previous_ip == ip:
		return previous_host
	else:
		try:
			new_host = socket.gethostbyaddr(ip)
			previous_ip = ip
			previous_host = new_host[0]
			return new_host[0]
		except Exception:
			new_host = ip
			previous_ip = ip
			previous_host = ip
			return new_host

# READ FILE
with open(log_path, 'rt') as log:
	text = log.read();

# FOR EACH LINE
for line in text.split("\n"):
	if len(line) > 5:

		# PARSE THE LINE ONCE AND ADJUST FIELD LENGTHS
		match = re.match( regex, line )
		if match is None:
			continue
		ip		= adjust_item( match.group( 1 ), 15 )
		hostname	= adjust_item( str(get_hostname(ip.strip())), 30 )
		date		= (match.group( 2 )).split(" ")[0]
		request		= adjust_item( match.group( 3 ), 40 )
		code		= adjust_item( match.group( 4 ), 4 )
		size		= adjust_item( match.group( 5 ), 8 )
		ref		= adjust_item( match.group( 6 ), 30 )
		agent		= adjust_item( match.group( 7 ), 3 )

		# HTTP 200 OK
		if code.strip()[0] == "2":

			print( date + " " , end="")
			print( Fore.GREEN + Style.BRIGHT + code[:4] + Fore.RESET + Style.NORMAL, end="" )
			print( ip[:15] + " " , end="")
			print( hostname[:30] + " " , end="")
			print( size[:8] + " " , end="")

			# CHECK IF METHOD USED IS GET | POST
			if request[0] == "G" or request[0] == "P" :
				print( request[:40] + " " , end="")
			else:
				# OTHER METHODS PRINT IN COLOR
				print( Back.BLACK + Fore.RED + Style.DIM + request[:40] + Fore.RESET + Back.RESET + Style.NORMAL + " ", end="" )

			print( ref[:30] + " ", end="")
			print( agent[:3])

		# HTTP 300
		elif code.strip()[0] == "3":
			print( date + " " , end="")
			print( Fore.YELLOW + Style.BRIGHT + code[:4] + Fore.RESET + Style.NORMAL,  end="" )
			print( ip[:15] + " " , end="")
			print( hostname[:30] + " " , end="")
			print( size[:8] + " " , end="")
			print( request[:40] + " " , end="")
			print( ref[:30] + " ", end="")
			print( agent[:3])

		# HTTP 400
		elif code.strip()[0] == "4":
			print( date + " " , end="")
			print( Fore.BLUE + Style.BRIGHT + code[:4] + Fore.RESET + Style.NORMAL,  end="" )
			print( ip[:15] + " " , end="")
			print( hostname[:30] + " " , end="")
			print( size[:8] + " " , end="")

			# CHECK IF METHOD USED IS GET | POST
			if request[0] == "G" or request[0] == "P" :
				print( request[:40] + " " , end="")
			else:
				# OTHER METHODS PRINT IN COLOR
				new_request=Fore.RED + request[:40] + Fore.RESET
				print( new_request + " " , end="")
			print( ref[:30] + " ", end="")
			print( agent[:3])

		# HTTP 500
		elif code.strip()[0] == "5":
			print( date+ " " , end="")
			print( Fore.MAGENTA + Style.BRIGHT + code[:4] + Fore.RESET + Style.NORMAL ,  end="" )
			print( ip[:15] + " " , end="")
			print( hostname[:30] + " " , end="")
			print( size[:8] + " " , end="")
			print( request[:40] + " " , end="")
			print( ref[:30] + " ", end="")
			print( agent[:3])

		# OTHER
		else:
			print( date + " " , end="")
			print( code[:4],  end="" )
			print( ip[:15] + " " , end="")
			print( hostname[:30] + " " , end="")
			print( size[:8] + " " , end="")
			print( request[:40] + " " , end="")
			print( ref[:30] + " ", end="")
			print( agent[:3])

You will see output like this:
[Image: colorized access log output]

Mèdved – web-based DNS zone transfer automation

It's been a while since my last post, so today I have something bigger and – most probably – more useful than usual. [Download link here]

I present to you Mèdved (bear in Serbian). It is part of a suite of tools I'm creating, hence the main directory is named carnivores.
This is a web-based tool designed to automate the search for DNS zone transfers. It has an intuitive interface and a few helpful shortcuts. As input it expects a domain or a list of domains. Ideally the list should be comma-separated, but it will handle space- or CR-LF-separated lists as well. Aside from the normal results it gives you a log of performed searches, and all successful transfers are archived.

Requirements:

  • Linux + Apache2
  • path to medved.php: /var/www/carnivores/medved/medved.php (simply extract the archive to /var/www/)

There are some requirements for the directory structure and permissions, so I'll show you what the tree should look like:

[Image: required directory tree]

Below is the first page with help toggled:

[Image: Medved DNS zone transfer tool – first page with help toggled]

It has been implemented with responsive design, so you can use it on your smartphone or tablet, although the interface becomes slightly denser:

[Image: Medved responsive/mobile interface]

You can supply the list as

domain.com, domain2.com, domain3.com.

If you have a list looking like this:

domain.com domain2.com domain3.com

or like this:

domain.com
domain2.com
domain3.com

you can paste it as well; just use the Spaces to commas button before clicking the Analyze button, and the list will be corrected to the expected form. If you have a list of URLs instead of domains, use the Sanitize URLs button and it should strip all the unnecessary parts from the URLs.
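
Under the hood such normalization can be as simple as splitting on any run of whitespace or commas and re-joining. Medved itself is PHP, so this Python snippet is only an illustration of the idea:

import re

raw = "domain.com domain2.com\r\ndomain3.com"
normalized = ", ".join(re.split(r"[\s,]+", raw.strip()))
print(normalized)  # domain.com, domain2.com, domain3.com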

This and more about the available functions and shortcuts is described in the help.

Let's see how it works with an example:

[Image: the DNS zone transfer automation tool in action]

As you can see, the transfer for Microsoft is actively refused. Each tested NS server has its own tab. The warning sign shown for the other domain (which I removed from the picture) indicates that the server does not accept TCP connections. The OK sign for one of the servers indicates a successful transfer.

It is common to test the same domain again after some time, to see whether new records have been added or the server configuration has been corrected. That is why all successful transfers are saved in the archive. The archive is a simple list of available transfer results.

[Image: the DNS zone transfer archive]

You can filter the results to show only a particular domain by clicking on the domain name. The list shows the date of the transfer, the number of records discovered and a link to review the transfer data. If more than one server responded with transfer data for a particular domain, the number of records shown will be the sum across all those servers.
This might give you the false impression that you got 1000 records from the transfer when in fact you received 500 records, but from two servers.

If you need unique records, just save the file and use the command

cat records.txt | sort -u

I'm not going to post any of the code here, as that would be really tedious. Instead you can download all of it using the link below.

Download from here or medved [change to 7z].

As usual, I'm not responsible for how you use this tool. It is presented only as a proof of concept. You can use it, but you cannot distribute it without my knowledge and explicit consent.

I’ve used a code snippet from stevenlevithan.com for URL parsing and shortcut.js file from openjs.com for creating keyboard shortcuts.


DNS zone transfer script

Notice: I've made a web-based version of this script, with more functions and an archive of successful transfers.

A script automating the discovery of name servers that allow zone transfers.

Nothing fancy. Just to make it easier.

The output:

[Image: zone transfer discovered]

If you use the command shown at the bottom of the image above, you will get results like this:

[Image: successful zone transfer for an example domain]


Script:

#!/bin/bash

domain="$1"
data=""
server=""

for dnsserver in $(host -t ns "$domain" | cut -d " " -f 4);
do
        # STRIP THE TRAILING DOT FROM THE NS NAME
        dns_server="${dnsserver%.}"
        zone_data=$(dig axfr "$domain" "@$dns_server")

        # CHECKING ZONE TRANSFER
        check=$(echo "$zone_data" | grep -c "Transfer failed")

        if [[ $check -ne 0 ]];
        then
                echo -e " Transfer \033[31mFAILURE\033[00m at $dns_server"
        else
                echo -e " Transfer \033[32mSUCCESS\033[00m at $dns_server"

                # REMEMBER LAST SUCCESSFUL
                data="$zone_data"
                server="$dns_server"
        fi

done

echo ""
if [[ -n "$server" ]];
then
        echo " Use command: dig axfr $domain @$server"
fi

# UNCOMMENT THIS IF YOU WANT THE ZONE DATA OUTPUT
# echo "$data"
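
For comparison, the same check can be done in Python with the dnspython library. This is only a sketch of an equivalent under my own assumptions, not part of the script above:

# Requires dnspython (pip install dnspython); the domain is an example.
import dns.resolver
import dns.query
import dns.zone

domain = "example.com"
for ns in dns.resolver.resolve(domain, "NS"):
    server = str(ns.target).rstrip(".")
    addr = str(dns.resolver.resolve(server, "A")[0])  # xfr() expects an address
    try:
        zone = dns.zone.from_xfr(dns.query.xfr(addr, domain))
        print("Transfer SUCCESS at %s (%d records)" % (server, len(zone.nodes)))
    except Exception:
        print("Transfer FAILURE at %s" % server)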


4chan board /hr image downloader

[ command line image download manager ]

Two decades ago, browsing the internet over a 56k modem was an agonizing experience whenever you encountered a webpage rich with pictures. They had to be compressed, and of course the compression was lossy.
Now you can download high-resolution pictures with the click of a button and wait only a couple of seconds for them to fully load. Bandwidth is not an issue anymore.
What >IS< the issue then? Where to get really high-resolution pictures (above 1920×1080) on a specific and very narrow topic.
If you like (as I do) old medieval maps, NASA's best space pictures, landscape photos or old paintings that are hard to find in high resolution, and you will not feel offended by occasional nudity – then the /hr board at 4chan.org is the place for you. There you will find multiple collections of really amazing pictures compiled into single threads, just waiting for you to grab them. Yes – this is 4chan, famous for being lightly moderated and for anonymous postings. As warned before, you might encounter some nudity, but I guess that is the price for top-notch pictures you would otherwise never have found.

The /hr board is a collection of threads containing posts with pictures. While I really like some of them, I'm not a patient person when it comes to downloading things manually by clicking on each and every one of them. Therefore, I've created a bash script that downloads all the pictures for me automatically. It is fairly simple and works in three phases: first it collects all the links to threads, then it parses those threads and isolates the links to images, and finally it downloads those images to the specified directory.

While it is capable of downloading at full speed, I've limited the parsing of webpages to 10000 B/s and the downloading of images to 200 kB/s, using curl and wget respectively.
I think it's a matter of netiquette not to overload the 4chan servers.
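
Stripped of all the progress-bar drawing, the three phases boil down to something like the Python sketch below. The href patterns are assumptions based on what the bash script greps for, and it deliberately omits the rate limiting described above:

# Minimal three-phase sketch; URLs match the ones the bash script scrapes.
import re
import urllib.request

BASE = "http://boards.4chan.org/hr/"

def fetch(url):
    with urllib.request.urlopen(url) as r:
        return r.read().decode("utf-8", errors="replace")

# 1. collect links to threads from the index pages
threads = set()
for page in [""] + [str(n) for n in range(1, 16)]:
    threads.update(re.findall(r'href="(res/\d+)"', fetch(BASE + page)))

# 2. collect protocol-relative links to full-size images from each thread
images = set()
for t in threads:
    images.update(re.findall(r'href="(//images\.[^"]+)"', fetch(BASE + t)))

# 3. download every image found
for i in images:
    urllib.request.urlretrieve("http:" + i, i.rsplit("/", 1)[-1])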

Take a peek at how it looks when executed:

  1. Collecting links to sub-pages:

[Image: collecting links to sub-pages of 4chan]

  2. Collecting links to images:

[Image: collecting links to 4chan images]

  3. Downloading images:

[Image: downloading images from 4chan]

 


Function definitions are in a separate file below.

Without further ado, here it is:

dwm.sh


#!/bin/bash

 source myfunctions.sh

 ############################
 #        VARIABLES         #
 ############################

 today=`date +%d.%m.%y`
 time=`date +%H:%M:%S`
 cols=$(tput cols)
 lines=$(tput lines)
 download_counter=0;
 curltransfer="100000B"

 margin_offset=0;
 margin_text_offset=2;

 let top_box_t=0;
 let top_box_b=0+5;
 let top_box_l=0+$margin_offset;
 let top_box_r=$cols-$margin_offset;
 let top_box_width=$cols-$margin_offset-$margin_offset

 site="http://boards.4chan.org/hr/"

 if [[ ! -d "./$today" ]]; then  mkdir ./$today; fi

 tput civis
 clear

 draw_top_box;
 draw_bottom_box;
 scrap_links;

 cat links.4chan | sort -u >> uniquelinks.4chan
 if [[ -e links.4chan ]]; then rm links.4chan; fi
 cat uniquelinks.4chan | grep -v "board" | cut -d "#" -f 1 > tmp.4chan
 rm uniquelinks.4chan
 cat tmp.4chan | sort -u >> uniquelinks.4chan
 rm tmp.4chan

 scrap_images;

 cat images.4chan | sort -u >> uniqueimages.4chan
 rm images.4chan

 draw_panel;
 draw_headers;

 download_images;

 tput cup 20 0;

 tput cnorm

And the required functions file:
myfunctions.sh

#!/bin/bash

check_image()
{
        echo "1:" $1
        echo "2:" $2
}

draw_headers()
{
        tput cup $(($top_box_t+2)) $(($top_box_l+2));
        echo -en "$EINS\033[1;30m\033[40m#\033[0m";
        tput cup $(($top_box_t+2)) $(($top_box_l+6));
        echo -en "$EINS\033[1;30m\033[40mFILE NAME\033[0m";
}

download_images()
{

        let scroll_lines=$lines-6
        top=4

        tput cup $top 0
        scrolled_lines=1;
        allfiles=`cat uniqueimages.4chan | wc -l`

        index=0

        for i in `cat uniqueimages.4chan`
        do

                filename=`echo $i | cut -d "/" -f 6`
                if [[ $((index%$scroll_lines)) -eq 0 ]];
                then
                        tput cup $top 0
                        for ((j=0; j<$scroll_lines; j++))
                        do
                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
                        done
                        tput cup $top 0
                fi

                echo -ne "\033[s"

#               if [[ $index -gt 999 ]];
#               then
#                       tput cup $top 0
#                        for ((j=0; j<$scroll_lines; j++))
#                        do
#                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
#                        done
#                        tput cup $top 0
#                       let index=1
#                       echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
#               fi
                if [[ $index -lt 10 ]];
                then
                        echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -lt 100 && $index -gt 9 ]];
                then
                        echo -e "  $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -lt 1000 && $index -gt 99 ]];
                then
                        echo -e " $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                elif [[ $index -gt 999 ]];
                then
                        tput cup $top 0
                        for ((j=0; j<$scroll_lines; j++))
                        do
                                echo -e "$EINS\033[32m\033[40m                                                                          \033[0m";
                        done
                        tput cup $top 0
                        echo -ne "\033[s"
                        let index=1
                        echo -e "   $index  $EINS\033[30m\033[47mhttp:$i\033[0m"
                fi

                #DOWNLOADING HERE
                color=1
                size=0
                # SKIP FILES ALREADY RECORDED IN THE DOWNLOAD LOG
                if [[ ! -e ./4chan_download.log ]]; then touch ./4chan_download.log; fi
                download_check=`grep "$filename" ./4chan_download.log | wc -l`
                if [[ $download_check -eq 0 ]];
                then
                        let color=1
                        wget -q --limit-rate=200k -P ./$today http:$i
                        # RECORD THE DOWNLOAD SO RERUNS SKIP IT
                        echo "$filename" >> ./4chan_download.log
                        size=`ls -hls ./$today/$filename | cut -d " " -f 6`
                        let download_counter=$download_counter+1
                else
                        let color=2
                fi

                echo -ne "\033[u"
                if [[ $index -lt 10 ]];
                then
                        echo -en "   $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                elif [[ $index -lt 100 && $index -gt 9 ]];
                then
                        echo -en "  $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                elif [[ $index -lt 1000 && $index -gt 99 ]];
                then
                        echo -en " $index  $EINS\033[m\033[40mhttp:$i\033[0m"
                        if [[ $color -eq 1 ]];
                        then
                                echo -e  "\t[$EINS\033[32m\033[40m+\033[0m]"
                        else
                                echo -e  "\t[$EINS\033[33m\033[40m*\033[0m]"
                        fi
                fi

                let index=$index+1

                echo -ne "\033[s";
                #draw_bottom_box;
                tput cup $top_box_t $(($top_box_l+20));
                echo -en "$EINS\033[30m\033[47mDOWNLOADED $download_counter/$allfiles\033[0m";
                echo -ne "\033[u";
        done
}

scrap_images()
{
 tput cup $(($top_box_t+5)) $(($top_box_l+1));
 echo -en "$EINS\033[1;30m\033[40mSCRAPING IMAGES\033[0m";
 tput cup $(($top_box_t+5)) $(($top_box_l+20));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 tput cup $(($top_box_t+5)) $(($top_box_l+36));
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

 urls=`cat uniquelinks.4chan | wc -l`
 index=0;

 position=21
 for i in `cat uniquelinks.4chan`;
 do

        let index=$index+1
        tput cup $top_box_t $(($top_box_l+20));
        echo -en "$EINS\033[30m\033[47mSCRAPED $index/$urls\033[0m";

        #HERE GOES THE CODE FOR images/# SCRAPPING
        let left=$position
        tput cup $(($top_box_t+5)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";

        curl -s --limit-rate 10000B http://boards.4chan.org/hr/$i | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e  '/^$/ d' | grep "images" | uniq >> images.4chan

        let position=$position+1
        if [[ $position -eq 36 ]];
        then
                tput cup $(($top_box_t+5)) $(($top_box_l+36));
                echo -en "$EINS\033[1;30m\033[40m]\033[0m";
                let position=21;
                tput cup $(($top_box_t+5)) $((1+$position));
                echo -en "$EINS\033[1;30m\033[40m              \033[0m";
        fi

 done

#CLEAN PROGRESS BAR
 for i in {1..14};
 do
        let left=$((19+$i))
        tput cup $(($top_box_t+5)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m \033[0m";
 done

#MARK AS COMPLETE
 tput cup $(($top_box_t+5)) $(($top_box_l+20+14));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 echo -en "$EINS\033[32m\033[40m+\033[0m";
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

#CLEAN COUNTER
 tput cup $top_box_t $(($top_box_l+20));
 echo -en "$EINS\033[30m\033[47m                                \033[0m";

}

scrap_links()
{
 if [[ -e links.4chan ]];
 then
        rm links.4chan;
 fi
 tput cup $(($top_box_t+4)) $(($top_box_l+1));
 echo -en "$EINS\033[1;30m\033[40mSCRAPING LINKS\033[0m";
 tput cup $(($top_box_t+4)) $(($top_box_l+20));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 tput cup $(($top_box_t+4)) $(($top_box_l+36));
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";

#CLEAN OUTPUT FILE
        if [[ -e links.4chan ]]; then rm links.4chan; fi

#SCRAP THE FIRST PAGE
 curl -s  --limit-rate $curltransfer http://boards.4chan.org/hr/ | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' | grep "res/" | sort -u >> links.4chan

#SCRAP REST
 for i in {1..15};
 do
        let left=$((20+$i))
        tput cup $(($top_box_t+4)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
        curl -s  --limit-rate $curltransfer http://boards.4chan.org/hr/$i | grep -o '<a .*href=.*>' | sed -e 's/<a /\n<a /g' |  sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' | grep "res/" | sort -u  >> links.4chan
 done

#CLEAN PROGRESS BAR
 for i in {1..14};
 do
        let left=$((19+$i))
        tput cup $(($top_box_t+4)) $(($top_box_l+$left));
        echo -en "$EINS\033[1;30m\033[40m \033[0m";
 done

#MARK AS COMPLETE
 tput cup $(($top_box_t+4)) $(($top_box_l+20+14));
 echo -en "$EINS\033[1;30m\033[40m[\033[0m";
 echo -en "$EINS\033[32m\033[40m+\033[0m";
 echo -en "$EINS\033[1;30m\033[40m]\033[0m";
}

function draw_top_box()
{
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $top_box_t $left;
        echo -en "$EINS\033[30m\033[47m \033[0m";
 done

 tput cup $top_box_t $(($top_box_l+2));
 echo -en "$EINS\033[30m\033[47mDWM v.1.0\033[0m";
 tput cup $top_box_t $(($cols-20));
 echo -en "$EINS\033[30m\033[47m$time | $today\033[0m";

 tput cup $lines 0;

}

function draw_bottom_box()
{
 tput cup $lines 0;
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        # PAINT THE STATUS BAR ON THE BOTTOM TERMINAL LINE
        tput cup $lines $left;
        echo -en "$EINS\033[30m\033[47m \033[0m";
 done

 tput cup $lines 0;
 echo -en "$EINS\033[30m\033[47m  DOWNLOADED FILES: $download_counter\033[0m";
}

function draw_panel()
{
 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $(($top_box_t+1)) $left;
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
 done

 for (( i=0; i<$top_box_width; i++ ))
 do
        let left=$top_box_l+$i;
        tput cup $(($top_box_t+3)) $left;
        echo -en "$EINS\033[1;30m\033[40m-\033[0m";
 done

 tput cup $(($top_box_t+2)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";
 tput cup $(($top_box_t+2)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";
 tput cup $(($top_box_t+2)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";

 tput cup $(($top_box_t+1)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";
 tput cup $(($top_box_t+1)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $top_box_l;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";
 tput cup $(($top_box_t+3)) $top_box_r;
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+1)) $(($top_box_l+4));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+1)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+3)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m+\033[0m";

 tput cup $(($top_box_t+2)) $(($top_box_r-5));
 echo -en "$EINS\033[1;30m\033[40m|\033[0m";

}

Be aware that all scripts are run at your own risk and while every script has been written with the intention of minimising the potential for unintended consequences, the owners, hosting providers and contributors cannot be held responsible for any misuse or script problems.


HP BSM / HP BAC Business Process Monitor maintenance automation with powershell

For those of you who work with application performance monitoring or synthetic user experience monitoring, the names HP Business Availability Center (BAC – formerly known as Mercury Topaz) and HP Business Service Management (BSM – as it is called nowadays) might sound somewhat familiar. One of the features this suite includes is the Business Process Monitor (sometimes referred to as a probe). A BPM is a piece of software capable of running recorded scripts; their execution is measured and reported back to the central server – BAC/BSM. This allows you to diagnose and narrow down the cause of performance drops, network bottlenecks or availability issues, both overall and from a specific BPM.

To make it clearer:
Let's say you have an application that is available to your clients all over the world. You need to sustain availability and performance at the level specified in the SLA. You buy a BAC/BSM licence and install Business Process Monitors within your clients' network infrastructures. This allows you to track the performance of your application from multiple sites (cross-country or across the world). Scripts are commonly recorded with HP LoadRunner, deployed to the BPMs and executed. The data reported by each Business Process Monitor to the central server can show you how long it took to access your application from a specific location, how long it took your server to respond, how much time the SSL handshake consumed, which errors were encountered during script execution and which errors the end user saw when your application had issues. Overall a really, really great tool.

But… at some point, when the number of your applications grows, and with it the number of BPMs and monitored transactions (a transaction is a block of code within the script that contains particular actions – for example an authentication process or a data submit to a web page – and is reported separately on the server side), you will face the inevitable problem of tracking how many BPMs still work, how many have been shut down accidentally (or intentionally) by your clients, and which transactions are missing data and when they stopped reporting. Reviewing all of the BPMs scattered across multiple transactions, multiple applications and multiple profiles (a profile is a group of applications or a group of scripts) is tedious and painfully boring. I had to review hundreds of transactions and hundreds of BPMs each month – trust me, there are better ways of spending half a day at work. Daily routine kills the joy in you, piece by piece.

Once, I had enough, and I decided to automate this process.
The purpose: to know when a particular Business Process Monitor stopped responding, and for how long a particular transaction/script has not been executed or has not returned data.
The weapon of choice: PowerShell + the BSM OpenAPI.
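
The OpenAPI part is just an HTTP GET with an SQL-like query string. For reference, here is the kind of call the script's download functions make, sketched in Python; BAC.URL, the credentials, the profile name and the timestamp are placeholders exactly as in the script:

# Sketch of a GDE OpenAPI query; host, credentials and values are placeholders.
import urllib.parse
import urllib.request

query = ("SELECT DISTINCT szTransactionName as transaction, MAX(time_stamp) as time "
         "FROM trans_t WHERE profile_name='A' and time_stamp>1370000000")
url = ("https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX"
       "&query=" + urllib.parse.quote(query) + "&resultType=csv&customerID=1")
print(urllib.request.urlopen(url).read().decode())  # CSV rows: transaction, time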


Example output (as usual, sensitive data has been removed). The report covers the last three months. BPMs that failed more than two months ago are shown in magenta, more than a month ago in red, and more than a week ago in yellow.

[Image: example output of the BPM availability monitoring script]

Note that you have to supply your own username and password in the URLs in the script.

The variable $strict set to 1 forces the script to report a Business Process Monitor as faulty only when it has stopped responding across all scripts/transactions deployed to it. If it is set to 0, the script will report a Business Process Monitor as faulty even if it stopped executing only one of its deployed scripts/transactions.
Both modes are useful in some cases.


Script:


#############################
# BPMs / PROBES MAINTENANCE

# VARIABLES
 $storageDir = "D:\Work\tmp"
 $strict = 1;
 # DATES IN SECONDS
 $now = [int][double]::Parse((Get-Date -UFormat %s))
 $weekAgo = [int]($now - (60*60*24*7))
 $monthAgo = [int] ($now - (60*60*24*30))
 $threeMonthsAgo = [int] ($now - (60*60*24*90))
 $origin = New-Object -Type DateTime -ArgumentList 1970, 1, 1, 0, 0, 0, 0

 # NUMBER OF MONTHS - HOW FAR AGO TO REACH FOR DATA [ SHOULD BE EITHER 1 OR 3 ]
 $numberOfMonths = 3

 # TRANSACTIONS ARRAY
 $faultyT = New-Object object[] 150
 $faultyTIndex = 0;

 # PROFILES TABLE
 $faultyP = New-Object object[] 6
 $faultyPIndex = 0;

 # PROBES TABLE
 $faultyPr = New-Object object[] 150
 $faultyPrIndex = 0;

 # EFFECTIVELY FOUR-DIMENSIONAL ARRAY OF
 # PROFILES -> TRANSACTIONS -> PROBES -> [PROBE NAME][TIMESTAMP]
 $profileProbes = New-Object object[] 6

 # THREE-DIMENSIONAL ARRAY OF
 # PROFILES -> TRANSACTIONS -> [TRANSACTON NAME][TIMESTAMP]
 $profiles = New-Object object[] 6

 # MANUALLY FILLED ARRAY OF PROFILE NAMES
 $profilesNames = New-Object object[] 6
 $profilesNames[0] = "A"
 $profilesNames[1] = "B"
 $profilesNames[2] = "C"
 $profilesNames[3] = "D"
 $profilesNames[4] = "E"
 $profilesNames[5] = "F"

# FUNCTIONS
function downloadTransactions( [string]$profile )
{
  $webclient = New-Object System.Net.WebClient
  $url = "https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX&query=SELECT DISTINCT szTransactionName as transaction, MAX(time_stamp) as time FROM trans_t WHERE profile_name='" + $profile + "' and time_stamp>" + $weekAgo + " &resultType=csv&customerID=1"
  $file = "$storageDir\" + $profile + ".csv"
  $webclient.DownloadFile($url,$file)
}

function downloadProbes( [string]$transaction )
{
  $webclient = New-Object System.Net.WebClient
  $url = "https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX&query=SELECT DISTINCT szLocationName as probe, MAX(time_stamp) as time FROM trans_t WHERE szTransactionName='" + $transaction + "' and time_stamp>" + $monthAgo + " &resultType=csv&customerID=1"
  $file = "$storageDir\" + $transaction + ".csv"
  $webclient.DownloadFile($url,$file)
}

function downloadProbesQuaterly( [string]$transaction )
{
  $webclient = New-Object System.Net.WebClient
  $url = "https://BAC.URL/gdeopenapi/GdeOpenApi?method=getData&user=XXX&password=XXX&query=SELECT DISTINCT szLocationName as probe, MAX(time_stamp) as time FROM trans_t WHERE szTransactionName='" + $transaction + "' and time_stamp>" + $threeMonthsAgo + " &resultType=csv&customerID=1"
  $file = "$storageDir\" + $transaction + ".csv"
  $webclient.DownloadFile($url,$file)
}

function showFaultyProfiles()
{
  for ($i = 0; $i -lt $faultyP.Length; $i++)
  {
    if ( $faultyP[$i].Length -gt 1 )
    {
      Write-host "`t" $faultyP[$i]
    }
  }
}

function showFaultyTransactions()
{
 Write-host -foregroundcolor green "+---------------------+-----//------+";
 Write-host -foregroundcolor green "|  HOURS WITHOUT DATA | TRANSACTION |";
 Write-host -foregroundcolor green "+---------------------+-----//------+";
 for ($i = 0; $i -lt $faultyT.Length; $i++)
 {
   for ($j = 0; $j -lt $faultyT[$i].Length; $j=$j+2)
  {
   $size = 55;
   $tranName = [string]$faultyT[$i][$j]
   $spaces = [int]([int]$size - [int]$tranName.Length)
   for ($k = 0; $k -lt $spaces; $k++)
   {
    $tranName = " " + $tranName;
    }
   $size = 18;
   $tranTime = [string]$faultyT[$i][$j+1]
   $spaces = [int]([int]$size - [int]$tranTime.Length)
   for ($k = 0; $k -lt $spaces; $k++)
   {
    $tranTime = " " + $tranTime;
   }
   Write-host -foregroundcolor green "| " $tranTime "|"  $tranName "|"
   Write-host -foregroundcolor green "+---------------------+-----//----+";
   }
 }
}

function printRow ( [string]$dateString, [string]$diffString,
   [string]$probeString, [string]$transactionString,  [string]$color )
{
  Write-host -nonewline -foregroundcolor green "| "
  Write-host -nonewline -foregroundcolor $color $dateString
  Write-host -nonewline -foregroundcolor green " | "
  Write-host -nonewline -foregroundcolor $color $diffString
  Write-host -nonewline -foregroundcolor green " | "
  Write-host -nonewline -foregroundcolor $color $probeString
  Write-host -nonewline -foregroundcolor green " | "
  Write-host -nonewline -foregroundcolor $color $transactionString
  Write-host -foregroundcolor green " |"
}

function drawLine()
{
  Write-host -foregroundcolor green "+--//--+--//--+--//--+--//--+";
}
function drawHeader()
{
  Write-host -foregroundcolor green "+--//--+--//--+--//--+--//--+";
  Write-host -foregroundcolor green "| LAST SEEN | HOURS LOST | PROBE | LAST TRANSACTION |";
  Write-host -foregroundcolor green "+--//--+--//--+--//--+--//--+";
}

# PROFILES

Write-Host -nonewline -foregroundcolor green " COLLECTING PROFILE TRANSACTIONS `t-"
$progress = "-"
for ($i = 0; $i -lt $profilesNames.Length; $i++)
{
  downloadTransactions ([string]$profilesNames[$i])
  $file = $storageDir + "\" + $profilesNames[$i] + ".csv"
  $tmp = Import-Csv "$file" -header("transaction","time","o")
  $counter = 0;
  $tmp | ForEach-Object {
        if ( $_.transaction -ne "transaction" -and
          $_.transaction -ne "" )
        { $counter++ }
      }
  $transactions = New-Object object[] $counter
  $j = 0;
  $tmp | ForEach-Object {
    if ( $_.transaction -ne "transaction" -and
          $_.transaction -ne "" -and
          $_.transaction -ne "The data is empty" )
    {
         $time = [int][double]::Parse(($_.time));
      $pair = New-Object object[] 2
      $pair[0] = $_.transaction;
      $pair[1] = $time
      $transactions[$j] = $pair
      $j++
    }
  }
  $profiles[$i] = $transactions
  if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
  if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
  if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
  if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
}
Write-host -foregroundcolor green "`b`b`tV"
Write-host -nonewline -foregroundcolor green " COLLECTING TRANSACTIONS' TIMESTAMPS `t-"

for ($i = 0; $i -lt $profiles.Length; $i++)
{
 for ($j = 0; $j -lt $profiles[$i].Length; $j++)
 {
   $diff = 0;
   if ( $profiles[$i][$j].Length -gt 0 )
   {
    # CALCULATE TIME SINCE LAST RESPONSE
    $diff = $now - [int][double]::Parse(($profiles[$i][$j][1]));

    # PARSE TO HOURS
    $diff = $diff / 60 / 60;

    # ROUNDING
    $diff = [int]$diff;

    if ( $diff -gt 25 )
    {
     $transaction = New-Object object[] 2
     $transaction[0] = $profiles[$i][$j][0];
     $transaction[1] = $diff;
     $faultyT[$faultyTIndex] = $transaction;
     $faultyTIndex++;
    }
   }
   else
   {
     $faultyP[$faultyPIndex] = $profilesNames[$i];
     $faultyPIndex++;
   }
   if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
   if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
   if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
   if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
 }
}

Write-host -foregroundcolor green "`b`b`tV"

Write-host -nonewline -foregroundcolor green " COLLECTING PROBES' TIMESTAMPS `t`t-"

for ($i = 0; $i -lt $profiles.Length; $i++)
{
  $transactionProbes = New-Object object[] 200
  for ($j = 0; $j -lt $profiles[$i].Length; $j++)
  {
   if ( $profiles[$i][$j] )
   {
    $transaction = $profiles[$i][$j][0].Replace('&', '%26')
    $transaction = $transaction.Replace('+', '%2B')
    if ( $numberOfMonths -eq 3 )
    {
     downloadProbesQuaterly ($transaction)
    }
    else
    {
     downloadProbes ($transaction)
    }

    $file = $storageDir + "\" + $transaction + ".csv"
    $tmp = Import-Csv "$file" -header("probe","time","o")
    $counter = 0;
    $tmp | ForEach-Object {
           if ( $_.probe -ne "probe" -and $_.probe -ne "" )
           { $counter++; }
         }
    $probes = New-Object object[] $counter
    $k = 0;
    $tmp | ForEach-Object {
     if ( $_.probe -ne "probe" -and
               $_.probe -ne "" -and
               $_.probe -ne "The data is empty" )
     {
       $time = [int][double]::Parse(($_.time));
       $probe = New-Object object[] 2
       $probe[0] = $_.probe;
       $probe[1] = $time
       $probes[$k] = $probe
       $k++
     }
    }
    $transactionProbes[$j] = $probes;
   }
    if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
    if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
    if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
    if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
  }
  $profileProbes[$i] = $transactionProbes;
}

Write-host -foregroundcolor green "`b`b`tV"

Write-host -nonewline -foregroundcolor green " SANITAZING PROBES' TIMESTAMPS `t`t-"

for ($i = 0; $i -lt $profileProbes.Length; $i++)
{
  for ($j = 0; $j -lt $profileProbes[$i].Length; $j++)
  {
    $validation = 0;
     for ($b = 0; $b -lt $faultyT.Length; $b++)
     {
      if ($faultyT[$b])
      {
       if ($profiles[$i][$j])
       {
        if ( $faultyT[$b][0] -eq $profiles[$i][$j][0] )
        {
          $validation = 1
        }
       }
      }
     }

    if ($validation -eq 0)
    {
      for ($k = 0; $k -lt $profileProbes[$i][$j].Length; $k++)
      {
    $diff = $now - [int][double]::Parse(($profileProbes[$i][$j][$k][1]))
    $diff = $diff / 60 / 60;
    $diff = [int]$diff;

    if ($diff -gt 25)
    {
       $containCheck = 0;
      for ($a = 0; $a -lt $faultyPr.Length; $a++)
      {
     if ( $faultyPr[$a] )
     {
      if ( $faultyPr[$a][0] -eq $profileProbes[$i][$j][$k][0] )
      {
       $containCheck = 1;
       if ( $faultyPr[$a][1] -lt $profileProbes[$i][$j][$k][1] )
       {
         $faultyPr[$a][1] = $profileProbes[$i][$j][$k][1]
         $faultyPr[$a][2] = $profiles[$i][$j][0]
       }
       }
      }
    }
    if ( $containCheck -eq 0 )
    {
      $probe = New-Object object[] 3
      $probe[0] = $profileProbes[$i][$j][$k][0]
      $probe[1] = $profileProbes[$i][$j][$k][1]
      $probe[2] = $profiles[$i][$j][0]
      $faultyPr[$faultyPrIndex] = $probe
      $faultyPrIndex++;
    }
   }
  }
  }
  }
  if ($progress -eq "-") {Write-host -nonewline "`b\" ; $progress = "\"; continue}
  if ($progress -eq "\") {Write-host -nonewline "`b|"; $progress = "|";continue}
  if ($progress -eq "|") {Write-host -nonewline "`b/"; $progress = "/";continue}
  if ($progress -eq "/") {Write-host -nonewline "`b-"; $progress = "-";continue}
}

if ($strict -eq 1)
{
  for ( $a = 0; $a -lt $faultyPr.Length; $a++ )
  {
    if ($faultyPr[$a])
    {
    for ($i = 0; $i -lt $profileProbes.Length; $i++)
     {
     for ($j = 0; $j -lt $profileProbes[$i].Length; $j++)
      {
      for ($k = 0; $k -lt $profileProbes[$i][$j].Length; $k++)
      {
        if ( ($profileProbes[$i][$j][$k]) -and ($faultyPr[$a]) )
        {
         if ( $faultyPr[$a][0] -eq $profileProbes[$i][$j][$k][0] )
         {
          if ( $faultyPr[$a][1] -lt $profileProbes[$i][$j][$k][1] )
         {
          $diff = $now - [int][double]::Parse(($profileProbes[$i][$j][$k][1]))
          $diff = $diff / 60 / 60;
          $diff = [int]$diff;
          if ( $diff -lt 25)
          {
            $faultyPr[$a] = $null;
          }
          else
          {
            $faultyPr[$a][1] = $profileProbes[$i][$j][$k][1];
            $faultyPr[$a][2] = $profiles[$i][$j][0];
          }
         }
         }
        }
      }
     }
     }
    }
  }
}

Write-host -foregroundcolor green "`b`b`tV"

# PRINTING INFO
Write-host -foregroundcolor green "`n PROFILES MISSING DATA: ";
  showFaultyProfiles;

Write-host -foregroundcolor green "`n UNRESPONSIVE TRANSACTIONS: ";
  showFaultyTransactions;

Write-host -foregroundcolor green "`n INACTIVE PROBES: ";
  drawHeader;

# SORTING TABLE
$faultyPr = $faultyPr | sort-object @{Expression={$_[1]}; Ascending=$true}

# PRINTING INACTIVE PROBES
for ( $i = 0; $i -lt $faultyPr.Length; $i++ )
{
  if ($faultyPr[$i])
  {
   $whatIWant = $origin.AddSeconds($faultyPr[$i][1]);
   $size = 21;
   $dateString = [string]$whatIWant
   $spaces = [int]([int]$size - [int]$dateString.Length)
   for ($j = 0; $j -lt $spaces; $j++)
   {
     $dateString = $dateString + " ";
   }

   $diff = $now - [int][double]::Parse(($faultyPr[$i][1]))
   $diff = $diff / 60 / 60;
   $diff = [int]$diff;
   $size = 12;
   $diffString = [string]$diff
   $spaces = [int]([int]$size - [int]$diffString.Length)
   for ($j = 1; $j -lt $spaces; $j++)
   {
     $diffString = " " + $diffString;
   }

   $size = 25;
   $probeString = [string]$faultyPr[$i][0]
   $spaces = [int]([int]$size - [int]$probeString.Length)
   for ($j = 1; $j -lt $spaces; $j++)
   {
     $probeString = " " + $probeString;
   }
   $size = 45;
   $transactionString = [string]$faultyPr[$i][2]
   $spaces = [int]([int]$size - [int]$transactionString.Length)
   for ($j = 1; $j -lt $spaces; $j++)
   {
     $transactionString = " " + $transactionString;
   }
   if ( $diff -gt 1440 )
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Magenta")
   }
   elseif ( $diff -gt 720 )
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Red")
   }
   elseif ( $diff -gt 168 )
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Yellow")
   }
   else
   {
     printRow ($dateString) ($diffString) ($probeString) ($transactionString) ("Green")
   }
    drawLine;
 }
}
####################
