From UABgrid Documentation

Revision as of 10:56, 27 July 2011



Load and Link approach

transfer and uncompress (slow)

  1. log in to (Linux)
  2. create a directory for this data set in your scratch directory
    1. mkdir /lustre/scratch/user/proj1
    2. make sure that directory path is traversable by the galaxy user
      1. chmod og+x /lustre/scratch/user
      2. chmod og+x /lustre/scratch/user/proj1
  3. transfer the files with SCP
    1. scp *.fastq.gz
    2. I used "Secure Shell Client" for Windows, available from UABIT
    3. an open-source alternative is PuTTY
  4. UNCOMPRESS the fastq.gz files!
    1. cd /lustre/scratch/user/proj1
      1. gzip -d filename
      2.  !!WARNING: the following parallel decompress steps fail for files over 8G uncompressed!!
      3. find `pwd` -name "*.gz" -exec ksh -c 'qrsh "gzip -d \{}" &' \;
      4. ls -1 *.gz | xargs -L 1 -i_f_ ksh -c 'qrsh -cwd gzip -d _f_ &'
  5. make sure the files are readable by galaxy
    1. if you're in the galaxy-admin UNIX group you can do
      1. chgrp galaxy-admin *.fastq
      2. chmod g+r *.fastq
    2. if you're not, then you have to make the files world-readable (o = other)
      1. chmod o+r /lustre/scratch/user/proj1/*
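The transfer-and-uncompress steps above can be sketched end to end as a runnable shell script. A temporary directory stands in for /lustre/scratch/user/proj1, and a locally created gzip file stands in for the scp transfer; the actual paths, hostnames, and the galaxy user are site-specific assumptions.

```shell
set -e

SCRATCH=$(mktemp -d)               # stand-in for /lustre/scratch/user
PROJ="$SCRATCH/proj1"
mkdir -p "$PROJ"                   # step 2.1: per-project directory

# step 2.2: make every path component traversable by the galaxy user
chmod og+x "$SCRATCH" "$PROJ"

# step 3 stand-in: fabricate a "transferred" fastq.gz file
printf '@read1\nACGT\n+\nIIII\n' | gzip > "$PROJ/sample.fastq.gz"

# step 4: uncompress serially (safe at any size, unlike the qrsh fan-out)
( cd "$PROJ" && gzip -d ./*.fastq.gz )

# step 5.2: make the results world-readable
chmod o+r "$PROJ"/*.fastq

ls -l "$PROJ"
```

Note the serial gzip -d here; the parallel qrsh variants in step 4 are faster but, per the warning above, unreliable past 8G uncompressed.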

link into galaxy dataset (fast)

  1. get Admin privileges in Galaxy
    1. either get Shantanu to make you an admin in Galaxy
    2. or grab someone who already is one (John, Curtis)
  2. In Galaxy GUI:
    1. admin > Manage Data Libraries > create new library
      1. add Datasets
        1. Upload option: Upload files from system path
          1. Get a list of absolute path names using one of the following
            1. cd /lustre/scratch/user/proj1 && find `pwd` -name "*.fastq"
            2. find /lustre/scratch/user/proj1 -name "*.fastq"
          2. paste list of absolute path names into URL/Text box in Web Admin GUI
        2. Change "Copy data into Galaxy?" to "Link to files without copying into Galaxy"
        3. Put something mnemonic in the Message box.
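The path-listing step can be tried against any directory before pasting into the Galaxy upload box. A minimal sketch, using a temporary directory as a stand-in for /lustre/scratch/user/proj1 (an assumption, not the real data set):

```shell
set -e
PROJ=$(mktemp -d)                  # stand-in for /lustre/scratch/user/proj1
touch "$PROJ/reads_1.fastq" "$PROJ/reads_2.fastq"

# both forms from the list above print one absolute path per line,
# which is exactly what the URL/Text box expects
PATHS=$(find "$PROJ" -name "*.fastq" | sort)
echo "$PATHS"
```

The find form is preferred because it always emits absolute paths regardless of the current working directory.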

link data into a history (fast)

I could then select the datasets, choose "For selected datasets: <Import to histories>" at the bottom of the page, and import them into a history so I can compute on them.