I have been using the backup software BackupPC for almost a decade, and about two years ago I started backing up our lab computers to a central backup server. BackupPC supports deduplication, so a large amount of data fits on a couple of 2 TB drives.
BackupPC supports many protocols (smb, ftp, tar/rsync via ssh), but we mostly use rsync via ssh. The data is not encrypted before the backup; strangely, not even the upcoming version 4 will support encrypting the data on the client before it leaves the machine. However, we store the backups on an encrypted volume, so they are at least protected if the backup server is stolen, and during transit the data is protected by ssh. Still, the system is not TNO ("trust no one"), since I (as the BackupPC administrator) can access the files.

On Linux, one feasible workaround would be to encrypt the user's home directory with eCryptfs (a built-in option when creating users on Ubuntu) and then back up the /home/.ecryptfs directory instead of the user's home directory. However, recovery would be much more of a problem: you would still be able to browse the directory structure and files of the backup, but the filenames would be meaningless, since they are encrypted as well.
There is one peculiarity in backing up Mac OS X machines: BackupPC normally connects as root via ssh to the client computer and executes the rsync backup command there. To make this possible on university-managed Mac OS X computers, we had to create a dedicated user ("backuppc") on the client machines and allow this user to execute rsync with root privileges, which is done by adding this line to the /etc/sudoers file:

backuppc ALL=NOPASSWD: /usr/bin/rsync

Then we have to change the ssh/rsync command for the Mac client on the BackupPC server, changing "root" into "backuppc".
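On the server side, this amounts to overriding the per-host transfer command so that ssh logs in as "backuppc" and wraps rsync in sudo. A sketch of the relevant settings in the host's config file (BackupPC 3.x syntax; the sudo path is an assumption and may differ on your clients):

```perl
# In the per-host config (e.g. pc/<host>.pl): connect as the dedicated
# "backuppc" user instead of root, and run rsync through sudo.
$Conf{RsyncClientCmd}        = '$sshPath -q -x -l backuppc $host /usr/bin/sudo $rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '$sshPath -q -x -l backuppc $host /usr/bin/sudo $rsyncPath $argList+';
```

The $sshPath, $rsyncPath, and $argList+ placeholders are expanded by BackupPC itself; only the login name and the sudo wrapper change relative to the default command.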