Adding scheduled tasks to Windows clients with GPO

In this example, I show how to deploy a scheduled task (taken from the article "Shutting down an idle Windows computer") to multiple domain clients using a GPO.

First, create a batch file (for example in %SystemRoot%\SYSVOL\domainname\scripts) with the following content:


schtasks /Create /RU System /TN "Shut down idle system" /SC ONIDLE /TR "C:\Windows\system32\shutdown.exe /s /f /t 0" /I 20

Open up the Group Policy Management console and add a new GPO. Go to Computer Configuration > Windows Settings > Scripts > Startup and add the newly created batch file. Now you just have to link the GPO to the OU(s) that should be affected.

Windows XP Professional Product Documentation – Schtasks:
http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/schtasks.mspx?mfr=true

Shutting down an idle Windows computer

Shutting down an idle Windows XP computer, for example to save energy costs, can be done through the Windows Task Scheduler. Just configure a new task according to the screenshots below, adjusting the parameters for shutdown.exe to your needs (see the MS support link).

How To Use the Remote Shutdown Tool to Shut Down and Restart a Computer in Windows 2000: http://support.microsoft.com/kb/317371

Upgrading an IBM DS4300 system – A different approach

In this article I describe an alternative way to upgrade an IBM DS4300 SAN system from 73GB Fibre Channel to 300GB Fibre Channel disks (if you finally managed to get hold of them).

A main objective, besides the added storage capacity, was avoiding downtime of the storage system during the upgrade.
The procedure may sound a bit uncommon, and it relies strongly on your confidence in the system and in RAID in general, but it worked really smoothly.

The upgrade is done by replacing the drives one by one: each time you pull a drive, the hot spare jumps in, and once the new drive is detected the array synchronizes back onto it. In effect, you simulate a series of drive failures. I suggest replacing only one drive per day, which takes 14 days in the end.
This procedure avoids downtime and SAN-to-SAN copies and only affects the system's performance marginally.
After all drives have been replaced, the system recognizes the added capacity and adds it to the array.

Backing up a remote fileserver with rsync over a ssh tunnel

Our scenario

We want to back up data from a remote host to our backup location.
For this, we use a combination of ssh and rsync.

This guide is kept fairly general. Originally, I set up a secure rsync backup from a Synology NAS at a remote site to a Linux server hosted in a DMZ, but it should work just as well for ordinary Linux-to-Linux backups.

[] -----rsync over ssh------> []
remote-host                   backup-location

Setting up users and programs

  1. Make sure you have rsync and ssh installed on both machines
  2. Create a new user on the backup-location (e.g. backupuser) and place its home directory in /home

Creating SSH trust relationships between the two servers

To be able to schedule the backup job without saving the SSH login password somewhere in plain text, we set up public-key authentication between the two hosts.

  1. Create an RSA keypair on the remote-host
    cd /home/USERNAME OR cd /root (if you work as root)
    mkdir .ssh
    cd .ssh

    ssh-keygen -t rsa -b 2048 (you can leave the passphrase empty; the examples below assume the private key is named remote_host.priv and the public key remote_host.pub)
  2. Export the remote-hosts public key to the backup-location
    cd /home/USERNAME OR cd /root (if you work as root)
    mkdir .ssh
    cd .ssh

    If you have previously copied the public key to a usb stick:
    cp /mnt/usb/remote_host.pub /home/USERNAME/.ssh OR /root/.ssh
  3. Tell the backup-location's ssh server that public-key login requests coming from the remote-host are OK
    cd /home/USERNAME/.ssh OR cd /root/.ssh (if you work as root)
    cat remote_host.pub >> authorized_keys
  4. Test the ssh connection from the remote-host to the backup-location
    ssh "backup-location"
  5. Make sure all keys have restrictive permissions applied to them: chmod 700 on the .ssh directory and chmod 600 on the key files, so that only the owner can read them!
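The key-handling steps above can be condensed into a short shell sketch (OpenSSH's ssh-keygen is assumed; a throwaway directory stands in for ~/.ssh here so the sketch is safe to run, on the real hosts you would use /home/USERNAME/.ssh or /root/.ssh):

```shell
# Condensed version of the key setup (OpenSSH assumed). A throwaway
# directory stands in for ~/.ssh so this sketch is safe to run as-is.
KEYDIR="$(mktemp -d)"
chmod 700 "$KEYDIR"

# Step 1: RSA keypair with an empty passphrase for unattended use
ssh-keygen -t rsa -b 2048 -N "" -f "$KEYDIR/remote_host.priv" -q

# ssh-keygen names the public half remote_host.priv.pub; rename it to
# match the remote_host.pub used in the steps above
mv "$KEYDIR/remote_host.priv.pub" "$KEYDIR/remote_host.pub"

# Step 3 (normally done on the backup-location): append the public key
cat "$KEYDIR/remote_host.pub" >> "$KEYDIR/authorized_keys"

# Step 5: restrictive, owner-only permissions on the key files
chmod 600 "$KEYDIR/remote_host.priv" "$KEYDIR/authorized_keys"
```

In practice steps 2 and 3 happen on the backup-location, so the cat into authorized_keys runs there after you have copied remote_host.pub over.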

Setting up the rsync server infrastructure (on backup-location)

Create /etc/rsyncd.conf with the following content:

# GLOBAL OPTIONS
log file=/var/log/rsyncd
pid file=/var/run/rsyncd.pid

# MODULE OPTIONS
[backup]
	comment = public archive
	path = /home/backupuser/data
	use chroot = no
	lock file = /var/lock/rsyncd
	read only = no
	list = yes
	uid = backupuser
	ignore errors = no
	ignore nonreadable = yes
	transfer logging = yes
	log format = %t: host %h (%a) %o %f (%l bytes). Total %b bytes.
	timeout = 600
	refuse options = checksum dry-run
	dont compress = *.gz *.tgz *.zip *.z *.rpm *.deb *.iso *.bz2 *.tbz

Hint:
Make sure the backupuser has the rights to write to the rsyncd logfile (/var/log/rsyncd)

Testing our rsync tunnel (on remote-host)

rsync -avz -e "ssh -i /root/.ssh/remote_host.priv" /vol/folder backupuser@backup-location::backup
OR
rsync -avz -e "ssh -i /home/USERNAME/.ssh/remote_host.priv" /vol/folder backupuser@backup-location::backup

Scheduling the backup job (on remote-host)

Take the command above (from the testing part), paste it into a text file named rsync_backup.sh (put it wherever you like), and don't forget to chmod +x it afterwards:


#!/bin/sh
rsync -avz -e "ssh -i /home/USERNAME/.ssh/remote_host.priv" /vol/folder backupuser@backup-location::backup

Then, open up your crontab (usually /etc/crontab) and add the following line:


#minute hour    mday    month   wday    who     command
0       3       *       *       *       root    /PATH-TO-YOUR-SH-FILE/rsync_backup.sh >> /var/log/rsync_backup.log 2>&1

This will start your backup job every day at 3am.
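If a nightly transfer can ever take longer than 24 hours, overlapping cron runs become a risk. A slightly hardened variant of rsync_backup.sh could guard against that with a lock file; this is a sketch, flock(1) from util-linux is assumed and the paths are the same hypothetical examples as above (written to /tmp here so the sketch is safe to run):

```shell
# Write a hardened rsync_backup.sh (hypothetical paths as above;
# flock(1) from util-linux is assumed). /tmp is only used here so
# the sketch is safe to run; put the script wherever you like.
cat > /tmp/rsync_backup.sh <<'EOF'
#!/bin/sh
# flock -n: if last night's run still holds the lock, give up
# instead of piling a second rsync onto it
exec flock -n /var/lock/rsync_backup.lock \
    rsync -avz -e "ssh -i /home/USERNAME/.ssh/remote_host.priv" \
    /vol/folder backupuser@backup-location::backup \
    >> /var/log/rsync_backup.log 2>&1
EOF
chmod +x /tmp/rsync_backup.sh
```

With the logging done inside the script itself, the crontab entry can then simply call the script without any redirection.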

BackupExec – Media sets vs. library partitioning

Introduction

In this article, I want to give a short overview of the advantages of using partitioning with backup libraries in Symantec's BackupExec, and hopefully shed some light on how BackupExec determines which media it will use for a given backup job.

Scenario

First, assume we have a small company which backs up its data every working day of the week: incremental backups on Monday through Thursday and a full backup on Friday. The company uses a robotic library with 8 slots and wants to make sure the backup types use different tapes, because the tapes with the full backups are stored at a safe location.

First attempt – Media sets

Now, how would you set up the backup plans to keep the incremental and the full backups separated, so that the full-backup tape sets can be archived at a safe location?

You might think you simply create two media sets in the BackupExec Media pane, one named Full and the other named Incremental, and then tell each backup job to use the corresponding media set... but this is exactly where the problem starts.
After running the whole thing for some time, you may notice that either your Full or your Incremental media set is empty and all tapes have shifted to the other media set.

Conclusion and Summary – Library partitioning vs. Media sets

BackupExec doesn't handle backup media the way you might expect at first glance.
Media sets are intended to identify the data stored on the tapes, not to identify the correct tapes for a job to be executed! To be really sure your backups stay separated, you need to enable partitioning on your backup library. Let's say you configure slots 1 to 4 for the full backups and slots 5 to 8 for the incremental ones. This works in a straightforward way: after partitioning, you can select from which partition each backup job should take its tapes.

Links

Creating, configuring, and targeting a backup job to a Robotic Library partition in Backup Exec For Windows Server
http://seer.entsupport.symantec.com/docs/262055.htm

List of ActiveDirectory User Attributes (2000 and 2003)

During my work as a system engineer I often come across situations where I need a quick overview of Active Directory attribute names. To be a bit more independent of other sites, I decided to start by mirroring an attribute list from the MS KB.

Optional Attributes

accountExpires Value:9223372036854775807
cn (Container) Value:Nirmal
codePage Value:0
countryCode Value:536
displayName Value:Display Name
distinguishedName Value:CN=nirmal,CN=Users,DC=test,DC=local
instanceType Value:4
name Value:nirmal
objectCategory Value:CN=Person,CN=Schema,CN=Configuration,DC
uSNChanged Value:50203
uSNCreated Value:13920
whenChanged Value:2022552554552
whenCreated Value:2022554588585
logonHours Value:://///////////////////////////
userAccountControl Value:524802

Required Attributes

dn Value:CN=nirmal,CN=Users,DC=test,DC=local
objectClass Value:User
sAMAccountName Value:SAMLNAME

Attributes that can’t be imported into AD

badPasswordTime Value:1
badPwdCount Value:1
lastLogoff Value:0932479234902343
lastLogon Value:12924723489374737
logonCount Value:0
primaryGroupID Value:513
pwdLastSet Value:0
sAMAccountType Value:805306368
objectGUID Value::QT2p48fufjweue839384ufufj/A==
objectSid Value::
memberOf Value:CN=Domain Admins,

Other

department Value:GIS
co (Country Name) Value:India
comment
company Value:Computer Sciences Corporation
description Value:Description Field Cost Centre
directReports
lastLogonTimestamp
adminCount Value:1
ADsPath
c (2 digit country) Value:IN
dSCorePropagationData
facsimileTelephoneNumber
givenName
homeDirectory Value:\\amppfilerp01\hthrmg$
homeDrive Value:H:\
homePhone
info (Phone notes)
initials Value:INT
ipPhone
isCriticalSystemObject
l (City) Value:City Field
userCertificate
userParameters
userPrincipalName Value:LogonName@test.local
userWorkstations
wWWHomePage Value:Web Page Field
mail Value:Emailss@sss.com
manager Value:CN=Users,DC=Local,DC=C
mobile
msNPAllowDialin Value:FALSE
otherFacsimileTelephoneNumber
otherHomePhone
otherIpPhone
profilePath Value:\\tqchain2k3pc\profiles\nirmal
otherMobile
otherPager
otherTelephone
pager
physicalDeliveryOfficeName Value:Office Name
postalCode Value:Zip Code
postOfficeBox Value:Post Office Box
scriptPath Value:qchain.vbs
servicePrincipalName
showInAdvancedViewOnly
sn (Surname) Value:Last Name Field
st (2 digit State / Province)
streetAddress
telephoneNumber
title
url

Taken from: http://support.microsoft.com/kb/555638
Further information: http://support.microsoft.com/kb/257218
Information on ActiveDirectory attribute time/date conversion: http://support.microsoft.com/kb/555936

Pause W2K3 SMB shares

Although there is no point-and-click way to pause a Windows server's shares, there is a slightly less convenient way to prevent users from accessing them:

  • Fire up a registry editor and navigate to HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Shares
  • Export all shares you want to reopen again later
  • Delete all shares you want to pause and close the registry editor
  • Restart the service called "Server"

et voila, you’re done.

Recursively delete outdated files with VBScript

Recently, I came across a situation where I had to delete outdated SQL backup files from our MS SQL server's backup directory. To tidy up the backup folder, I wrote a small VBScript which handles this for me:


' DeleteOutdated
'
' Parameters
' MaxFileAge: Maximum file age in days (modification date)
' Path: Folder which contains the files
'
' VBScript ref. http://msdn.microsoft.com/en-us/library/t0aew7h6(VS.85).aspx
' FSO ref. http://msdn.microsoft.com/en-us/library/z9ty6h50(VS.85).aspx

Dim objFso
Set objFso = CreateObject("Scripting.FileSystemObject")

Sub DeleteOutdated(Path, MaxFileAge)
	Set objDir = objFso.GetFolder(Path)
	For Each objFile in objDir.Files
		If objFile.DateLastModified < (Date() - MaxFileAge) Then
			AskDelete = MsgBox("Delete "&objFile.Path&"?",3,"Delete file")
			If AskDelete = 6 Then
				objFile.Delete
			End If
			If AskDelete = 2 Then
				WScript.Quit
			End If
		End If
	Next

	For Each objSubfolder in objDir.Subfolders
		Call DeleteOutdated(objSubfolder, MaxFileAge)
	Next
End Sub

Call DeleteOutdated("c:\outdatedstuff", 1)

Hint
In this state, the script asks for confirmation every time it finds a file which is older than specified. Of course, you might want to remove that prompt for production use, when script execution is scheduled.
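For comparison, the same recursive cleanup can be sketched on a Unix system with find(1); the directory and the one-day threshold below are just stand-ins for the script's Path and MaxFileAge parameters:

```shell
# Unix counterpart to DeleteOutdated (sketch; the directory and the
# age are stand-ins for the script's Path and MaxFileAge parameters)
mkdir -p /tmp/outdatedstuff
# -type f: files only; -mtime +1: last modified more than one day ago.
# find recurses into subfolders on its own, matching the VBScript's
# recursive Subfolders loop.
find /tmp/outdatedstuff -type f -mtime +1 -print -delete
```

Dropping -delete (keeping only -print) gives the same "ask first" safety margin as the MsgBox prompt: you see what would go before anything is removed.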

Extended
Here I attached a version which writes a logfile and also processes empty subfolders (updated on 26.08.2009):
deleteoutdated-wlog