
annoyances

  • Xlib: connection to ":0.0" refused by server
Try deleting ~/.Xauthority and restarting the X server.
  • Gnome panel crash
Try deleting ~/.recently-used and restarting the X server.
  • An email autoresponder can be annoying if the account is subscribed to a mailing list; the list members won't be happy to keep receiving your 'kind' automated replies...

Apache

Proxy

  • Read about proxy setup in Program D.

cron

cron searches its spool area (/var/spool/cron/crontabs) for crontab files (which are named after accounts in /etc/passwd); crontabs found are loaded into memory. Note that crontabs in this directory should not be accessed directly - the crontab command should be used to access and update them.
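
For example, to list your own crontab and then open it for editing:

crontab -l
crontab -e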

Minute  Hour  DOM  Month  DOW  Full File Location
0       3     *    *      *    /home/mysql_backup_day/some_backup.sql

In this example the job runs every day of every month at 3:00 AM.

Minute: 0-59 are valid
Hour: 0-23 are valid (24-hour clock)
Day Of Month: 1-31 are valid
Month: 1-12 are valid (1 is January, 2 is February)
Day Of Week: 0-6 are valid (0 is Sunday, 1 is Monday)
Full File Location: make sure you put the FULL path to the script you're automating.

If you wish to disable the email notification (and not send the output to a log file), then at the end of each cron job line you do not want to be notified about, append this redirection:

>/dev/null 2>&1
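
For example, the backup job from the table above would then read:

0 3 * * * /home/mysql_backup_day/some_backup.sql >/dev/null 2>&1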

ffmpeg

How do I encode JPEGs to another format?

If the JPEGs are named img1.jpg, img2.jpg, img3.jpg,..., use:

 ffmpeg -i img%d.jpg /tmp/a.mpg
'%d' is replaced by the image number.
'img%03d.jpg' matches 'img001.jpg', 'img002.jpg', etc.

The same system is used for the other image formats.
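
The pattern also works in the other direction; for example, to dump a video into numbered JPEGs (file names are just examples):

 ffmpeg -i /tmp/a.mpg img%03d.jpg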

find

To make a tar archive of the files that changed in the last 24 hours:

tar cvzf ime.tar.gz `find . -mtime -1 -type f`

Note that -mtime -1 means "modified within the last 24 hours"; plain -mtime 1 would only match files modified between 24 and 48 hours ago.
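
If file names may contain spaces, a more robust variant (assuming GNU find and GNU tar) passes a NUL-separated file list instead:

find . -mtime -1 -type f -print0 | tar -czvf ime.tar.gz --null -T -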

PGP

grep

  • Getting lots of "404" errors for "oldimage.gif" in your site stats? Use grep to find the offending pages:
grep -R "oldimage.gif" /var/www/html
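
To print only the names of the offending files, add -l:

grep -Rl "oldimage.gif" /var/www/html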

hdup

  • Is there some other way to open all those *.tar.gz files? I can only open the first one.
You can just cat the chunks together, cat file1 file2 > huge_file.tar.gz, and untar the result.
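
For example, with hypothetical chunk names:

cat backup.tar.gz.0 backup.tar.gz.1 > huge_file.tar.gz
tar xzvf huge_file.tar.gz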

mail

echo "Coming home for dinner!" | mail sylvia@home.com

(The quotes keep an interactive shell from history-expanding the '!'.)
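
To add a subject line, use -s:

echo "Coming home for dinner!" | mail -s "dinner" sylvia@home.com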

Sending HTML

This works for me:

  • mail -a "Content-type: text/html;" -s "Subject Title" some@domain.name < /home/local/web/mail.html
  • Alternatively, send the HTML as an attachment; some programs can do this from the command line, one of them being mutt (hmm, how do you set Content-type: text/html on the first attachment?):
$ mutt -a syslogs.tar.gz admin@domain.org < /dev/null

or

$ echo | mutt -a syslogs.tar.gz admin@domain.org

or

$ mutt -s "Birthday celebration" -a citymap.jpg all@friends.org < invitation.txt

For more check this URL: http://www.shelldorado.com/articles/mailattachments.html

email autoresponder

  • vacation

Create two files in your $HOME directory: .forward (dot forward) and .vacation.msg (dot vacation.msg).

In file .forward you can put:

username1, username@domain.name, "|/usr/bin/vacation -r 7 username"
Here username is your own login (the argument the vacation program needs), while username1, username@domain.name, etc. receive a copy of each incoming message (like a bcc) in addition to it being piped to the vacation program. Be careful with the -r switch, which sets the reply interval in days: I tried -r 0 and received 28000 replies :/

In the file .vacation.msg you can put, for example:

From: username@domain.name  
Subject: Re: $SUBJECT
This is just an e-mail receive confirmation. 
You don't need to reply on this message. Thank you.
Here $SUBJECT is replaced with the subject of the received message.
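
Depending on the vacation implementation, you may also need to initialize its reply database once before it starts answering; on BSD-style vacation this is:

vacation -I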

Install Courier

Secure Mail Relaying with Exim and OpenSSL

mplayer

  • mplayer can convert a video to an animated GIF:
mplayer -vo gif89a video.avi
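
The gif89a driver also takes suboptions for the frame rate and the output file name; for example (the values here are just examples):

mplayer -vo gif89a:fps=5:output=video.gif video.avi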

mysql

  • How to see all databases:
mysqlshow -p
  • If you get an error like this when dumping a database: /usr/bin/mysqldump: Got error: 1016: Can't open file: 'old.MYI'. (errno: 145) when using LOCK TABLES
You can repair it like this: open the mysql client, select the database with USE, then type: REPAIR TABLE old;
And don't forget to make a backup first!
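
A minimal repair session might look like this (somedb is a placeholder for the database name):

$ mysql -p
mysql> USE somedb;
mysql> REPAIR TABLE old;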

netcat

  • machineA creates a tar archive and pipes it to netcat; machineB has netcat listening on portB and feeds the data to tar, which unpacks it. Start the receiving side first.
machine B receives data:
cd destination ; nc -q 5 -l -p portB | tar -xkf -
machine A sends data:
cd somewhere/ ; tar -cf - * | nc -q 30 machineB portB
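
The same trick works for a single file; again, start the listening side first (file names are just examples):

machine B receives data:
nc -q 5 -l -p portB > backup.tar
machine A sends data:
nc -q 30 machineB portB < backup.tar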

Ports

  • To check which programs are using which ports:
netstat -plunt
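
If lsof is installed, it can answer the same question for a single port (port 80 is just an example):

lsof -i :80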

root-tail

  • /usr/bin/root-tail -g 1260x960+10+10 --wordwrap -font fixed -id `xprop -root XFCE_DESKTOP_WINDOW | cut -d' ' -f5` /var/log/syslog,#B0C4DE,'SYSTEM LOG' &

SSH

  • Part of the ssh protocol makes sure you are really connecting to the machine you think you are, and warns you if it notices anything wrong. Although fixed, preinstalled keys are convenient at times, you should consider generating unique host keys. To replace them, use these commands to overwrite the keys stored on your system.
ssh-keygen -t rsa1 -f /etc/ssh/ssh_host_key -N ""
ssh-keygen -t dsa -f /etc/ssh/ssh_host_dsa_key -N ""
ssh-keygen -t rsa -f /etc/ssh/ssh_host_rsa_key -N ""
  • Password-less logins can make working with multiple machines much smoother. The first step is to create a public/private key pair on the local machine with the command:
ssh-keygen -t dsa

Then repeat this procedure, starting from your local machine, for each remote machine you want to access:

scp .ssh/id_dsa.pub root@remote-ip-address:.ssh/temp.pub
ssh root@remote-ip-address
cd .ssh
cat temp.pub >> authorized_keys
rm temp.pub
logout

Note that you will be prompted for a password for the scp and ssh commands.
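
Many systems also ship the ssh-copy-id helper, which automates the copy-and-append steps above:

ssh-copy-id root@remote-ip-address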

  • scp, secure copy, allows you to copy files to and from remote machines just as simply as using cp to copy a file locally. Files are encrypted during transfer to prevent revealing their contents to anyone 'sniffing' on the network. You can specify local or remote files as sources or targets in the form username@hostname:filespec. You can omit username if the same username is used on both systems, and you can omit hostname if the file referred to is on the local machine. For example, to copy /etc/rc.local from a remote system to your current directory, you could use a command like:
scp root@192.168.1.2:/etc/rc.local .
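
Copying in the other direction, or a whole directory tree, works the same way; -r copies recursively (paths are just examples):

scp -r /var/www/html root@192.168.1.2:/var/backup/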

tar

  • How to untar bar-package.tar.bz2?
tar xjvf bar-package.tar.bz2
  • How to create a gzipped archive, archive.tar.gz?
tar -czvf archive.tar.gz file1 file2 dir/
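
To list an archive's contents without extracting it, use t instead of x:

tar -tzvf archive.tar.gz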

time

  • Put this line in root's crontab to set the system and hardware clocks every day at 5:05 AM:
5 5 * * * /usr/sbin/ntpdate ntp1.arnes.si >> /var/log/timelog.log; /sbin/hwclock --systohc

wget

Overview

Wget is a network utility to retrieve files from the Web using HTTP and FTP, the two most widely used Internet protocols. It works non-interactively, so it can keep working in the background after you have logged off. The program supports recursive retrieval of HTML pages as well as FTP sites. You can use wget to make mirrors of archives and home pages or to travel the Web like a WWW robot.

Examples

The examples are divided into three sections for clarity. The first section is a tutorial for beginners. The second section explains some of the more complex program features. The third section contains advice for mirror administrators, as well as even more complex features (that some would call perverted).

Simple Usage

  • Say you want to download a URL. Just type:
wget http://foo.bar.com/
  • But what will happen if the connection is slow and the file is lengthy? The connection will probably fail more than once before the whole file is retrieved. In this case, Wget will keep trying until it either gets the whole file or exceeds the default number of retries (20). It is easy to change the number of tries to 45, to ensure that the whole file arrives safely:
wget --tries=45 http://foo.bar.com/jpg/flyweb.jpg
  • Now let's leave Wget to work in the background, and write its progress to log file ' log '. It is tiring to type ' --tries ', so we shall use ' -t '.
wget -t 45 -o log http://foo.bar.com/jpg/flyweb.jpg &
  • The ampersand at the end of the line makes sure that Wget runs in the background. To remove the limit on the number of retries, use ' -t inf '.
  • Using FTP is just as simple. Wget will take care of the login and password.
wget ftp://foo.bar.com/welcome.msg

ftp://foo.bar.com/welcome.msg

   => 'welcome.msg'
   Connecting to foo.bar.com:21... connected!
   Logging in as anonymous ... Logged in!
   ==> TYPE I ... done. ==> CWD not needed.
   ==> PORT ... done. ==> RETR welcome.msg ... done.
  • Download the Oracle 9i manuals to your local directory using Cygwin's wget:
wget -q --tries=45 -r \
http://download-east.oracle.com/otndoc/oracle9i/901_doc

Advanced Usage

  • Would you like to read the list of URLs from a file? Not a problem:
wget -i file

If you specify ' - ' as file name, the URLs will be read from standard input.

  • Create a mirror image of a WWW site (with the same directory structure the original has) with only one try per document, saving the log of the activities to ' gnulog ':
wget -r -t1 http://foo.bar.com/ -o gnulog
  • Retrieve the first layer of Yahoo links:
wget -r -l1 http://www.yahoo.com/
  • Retrieve the index.html of ' www.lycos.com ', showing the original server headers:
wget -S http://www.lycos.com/
  • You want to download all the GIFs from an HTTP directory. The command 'wget http://host/dir/*.gif ' doesn't work, since HTTP retrieval does not support globbing. In that case, use:
wget -r -l1 --no-parent -A.gif http://host/dir/

It is a bit of a kludge, but it works perfectly. ' -r -l1 ' means to retrieve recursively, with maximum depth of 1. ' --no-parent ' means that references to the parent directory are ignored, and ' -A.gif ' means to download only the GIF files. ' -A " *.gif " ' would have worked too.

  • Suppose you were in the middle of downloading when Wget was interrupted, and you do not want to clobber the files already present. Then use:
wget -nc -r http://foo.bar.com/
  • If you want to encode your username and password into an HTTP or FTP URL, use the appropriate URL syntax:
wget ftp://name:password@foo.bar.com/myfile

Special Usage

  • If you wish Wget to keep a mirror of a page (or FTP subdirectories), use ' --mirror ', which is the shorthand for ' -r -N '. You can put Wget in the crontab file asking it to recheck a site each Sunday:
0 0 * * 0 wget --mirror ftp://x.y.z/pub -o /var/weeklog
  • You may wish to do the same with someone's home page. But you do not want to download all those images; you're only interested in the HTML.
wget --mirror -A.html http://www.w3.org/

More Information

You can find the sources of wget, together with all the documentation, at the following links:

http://www.gnu.org/software/wget/wget.html
http://www.lns.cornell.edu/public/COMP/info/wget/wget_toc.html
http://www.interlog.com/~tcharron/wgetwin.html

xmessage

  • You can use xmessage to put up temporary little notes to yourself, like PostIt notes, only from the command line:
xmessage "Your message here" &
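
The note can also dismiss itself after a number of seconds with -timeout:

xmessage -timeout 30 "Your message here" &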