Compare commits
2 Commits
Author | SHA1 | Date
---|---|---
Matteo Cypriani | bf7e090fbf |
Matteo Cypriani | 20baa5ecd9 |
@ -1,5 +0,0 @@
*.pyc
*.pyv
*.sw*
*.tar*
*.egg
@ -0,0 +1,102 @@
gcp v0.1.3

(c) Jérôme Poisson aka Goffi 2010, 2011

gcp (Goffi's cp) is a file copier.


** LICENSE **

gcp is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

gcp is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with gcp. If not, see <http://www.gnu.org/licenses/>.


** WTF ? **

gcp is a file copier, loosely inspired by cp, but with high-level features such as:
- progress indicator
- gcp keeps copying even when there is an issue: it just skips the problematic file and goes on
- journalization: gcp writes down what it is doing, which lets you know which files were effectively copied
- fixing names to be compatible with the target filesystem (e.g. removing incompatible chars like "?" or "*" on vfat)
- if you launch a copy while another one is already running, the files are added to the first queue; this avoids having your hard drive move its read/write head all the time
- files saving: you can keep track of files you have copied, and re-copy them later (useful when, for example, you always copy some free music to all your friends)
- gcp will be approximately option-compatible with cp (approximately because the behaviour is not exactly the same, see below)

/!\ WARNING /!\
gcp is at an early stage of development, and really experimental: use at your own risk!


** How to use it ? **

Pretty much like cp (see gcp --help).

Please note that the behaviour is not exactly the same as cp's, even if gcp aims to be option-compatible. Mainly, the destination filenames can be changed (on by default, can be deactivated).

gcp doesn't implement all of cp's options yet, but that is planned.


** Journalization **

The journal is intended to be used by gcp itself, but remains human-readable. It is located in ~/.gcp/journal

3 states are used:
- OK means the file was copied and all operations were successful
- PARTIAL means the file was copied, but something went wrong (e.g. changing the permissions of the file)
- FAILED: the file was *not* copied

After the state, a list of things that went wrong is shown, separated by ", "
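The journal described above can be pictured with a short parsing sketch. The exact line layout assumed here ("STATE path: issue, issue") is illustrative only, not gcp's documented format:

```python
# Minimal journal-line parser -- the "STATE path[: issue, issue]" layout is
# an assumed format for illustration, not gcp's actual one.
def parse_journal_line(line):
    state, _, rest = line.strip().partition(" ")
    path, _, issues = rest.partition(": ")
    return {
        "state": state,  # OK, PARTIAL or FAILED
        "path": path,
        "issues": issues.split(", ") if issues else [],
    }

lines = [
    "OK /tmp/song.ogg",
    "PARTIAL /tmp/video.avi: permissions not preserved",
]
records = [parse_journal_line(l) for l in lines]
```

A consumer could then, for instance, re-queue every FAILED or PARTIAL entry.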

** What's next ? **

Several improvements are already planned:
- copy queue management (moving copy order)
- advanced console interface
- notification (XMPP and maybe mail) when a long copy is finished
- retry for files which were not correctly copied
- badly encoded unicode filenames fix
- file copy integrity check

... and others are a "maybe":
- graphic interface
- desktop (KDE, Gnome, XFCE, ...) integration
- distant copy (ftp)
- basic server mode, for copying files over the network without the need of NFS or other heavy stuff


** Credits **

A big big thanks to the authors/contributors of...

progressbar:
gcp uses ProgressBar (http://pypi.python.org/pypi/progressbar/2.2), a class coded by Nilton Volpato which allows the textual representation of progression.

GLib:
This heavily used library is used here for the main loop, event catching, and for DBus. Get it at http://library.gnome.org/devel/glib/

DBus:
This excellent IPC is at the heart of gcp. Get more information at www.freedesktop.org/wiki/Software/dbus

Python and its amazing standard library:
gcp was coded quickly for my own needs thanks to this excellent and efficient language and its really huge standard library. Python can be downloaded at www.python.org

If I forgot any credit, please contact me (mail below) to fix it.

Big thanks to contributors and package maintainers.


** Contributions **

2011: Thomas Preud'homme <robotux@celest.fr>: manpage, stat resolution fix


** Contact **

You can contact me at goffi@goffi.org .

You'll find the latest version on my ftp: ftp://ftp.goffi.org/gcp, or check the wiki ( http://wiki.goffi.org/wiki/Gcp )

Please report any bug on http://bugs.goffi.org

You can also have a look at my other main projects (and maybe at the smaller ones too ;) ):
- lm (list movie): a tool to list movies using IMDb data, loosely inspired by ls
- SàT: my main project, a jabber/XMPP client, which is a brick for many other things I have in mind

Don't hesitate to give feedback :)
142
README.md

@ -1,142 +0,0 @@
gcp
===

gcp (Goffi's cp) is a file copier.


License
=======

gcp is free software: you can redistribute it and/or modify it under the terms
of the GNU General Public License as published by the Free Software Foundation,
either version 3 of the License, or (at your option) any later version.

gcp is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with
gcp. If not, see <http://www.gnu.org/licenses/>.


About
=====

gcp is a file copier, loosely inspired by cp, but with high-level
functionalities such as:

- **Progress bar.**
- gcp **keeps copying** even when there is an issue: it just skips the file,
  logs an error and goes on.
- **Logging**: gcp writes what it's doing to a log file; this allows you to
  know which files were effectively copied.
- **Fixing file names** to be compatible with the target filesystem (e.g.
  removing incompatible chars like `?` or `*` on FAT).
- **Queue**: if you launch a copy when another copy is already running, the
  files are added to the first queue; this optimizes hard drive head movement
  and filesystem fragmentation.
- **Files saving**: you can keep track of the files you have copied, and copy
  them again later (useful when, for example, you copy some free music to your
  friends on a regular basis).
- gcp will be **approximately option-compatible with cp** (approximately
  because the behaviour is not exactly the same, see below).

**WARNING**: gcp is at a relatively early stage of development, use at your own
risk!
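The file-name fixing feature above can be sketched in a few lines. The character set below is the one FAT actually forbids, but the function itself is an illustrative stand-in, not gcp's real `--fs-fix` implementation:

```python
# Illustrative sketch of FAT-style name fixing -- not gcp's actual code.
# FAT forbids these characters in file names; replace each with an underscore.
FAT_FORBIDDEN = set('\\/:*?"<>|')

def fix_name_for_fat(name):
    return "".join("_" if c in FAT_FORBIDDEN else c for c in name)

print(fix_name_for_fat('what?*.ogg'))  # what__.ogg
```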

How to use it?
==============

Pretty much like cp (see `gcp --help`).

Please note that the behaviour is not exactly the same as cp's, even if gcp
aims to be option-compatible. Mainly, the destination filenames can be modified
(cf. the `--fs-fix` option).

gcp doesn't implement all the options from cp yet, but it's a long-term goal.


Logging
=======

The log file is aimed to be used by gcp itself, but remains human-readable. It
is located in `~/.gcp/journal`.

3 states are used:

- **OK** means the file was copied and all operations were successful.
- **PARTIAL** means the file was copied, but something went wrong (file
  permissions could not be preserved, file name had to be changed, etc.).
- **FAILED**: the file was *not* copied.

After the state, a list of things that went wrong is shown, separated by ", ".

Contribution ideas
==================

Here are some ideas for future developments:

- handle XDG
- copy queue management (moving copy order)
- advanced console interface
- notification (XMPP and maybe email) when a long copy is finished
- retry for files that were not correctly copied
- badly encoded unicode filenames fix
- file copy integrity check

And in an even more distant future:

- graphic interface
- desktop (KDE, Gnome, XFCE...) integration
- distant copy (FTP)
- basic server mode, for copying files on network without the need of NFS or
  other heavy stuff


Credits
=======

A big big thanks to the authors/contributors of...

* **progressbar**:
  gcp uses [ProgressBar](https://pypi.python.org/pypi/progressbar), a class
  coded by Nilton Volpato that allows the textual representation of
  progression.

* **GLib**:
  This heavily used library is used here for the main loop, event catching,
  and for DBus. Get it at <https://developer.gnome.org/glib/>.

* **DBus**:
  This excellent IPC is at the heart of gcp. Get more information at
  <https://www.freedesktop.org/wiki/Software/dbus/>.

* **Python** and its amazing standard library:
  gcp was coded quickly for my own needs thanks to this excellent and
  efficient language and its really huge standard library. Python can be
  downloaded at <https://www.python.org/>.

If I forgot any credit, please contact me (email below) to fix that.

Big thanks to contributors and package maintainers.


Contributors
============

* Original author: Jérôme Poisson aka Goffi <goffi@goffi.org> 2010-2011.
* Thomas Preud'homme <robotux@celest.fr> 2011: manpage, stat resolution fix.
* Jingbei Li aka petronny 2016: conversion to Python3.
* Matteo Cypriani <mcy@lm7.fr> 2018: `--fs-fix` option, Python3 fixes.


Contact
=======

Feedback, bug reports, patches, etc. are welcome, either by email or on the
repository's issue tracker <https://code.lm7.fr/mcy/gcp/issues>.

You can also have a look at Goffi's other main project, [Salut à
Toi](https://www.salut-a-toi.org/) (SàT), a Jabber/XMPP-based multi-frontend,
multipurpose communication tool.

Don't hesitate to give feedback :)
@ -0,0 +1,485 @@
#!python
"""Bootstrap distribute installation

If you want to use setuptools in your package's setup.py, just include this
file in the same directory with it, and add this to the top of your setup.py::

    from distribute_setup import use_setuptools
    use_setuptools()

If you want to require a specific version of setuptools, set a download
mirror, or use an alternate download directory, you can do so by supplying
the appropriate options to ``use_setuptools()``.

This file can also be run as a script to install or upgrade setuptools.
"""
import os
import sys
import time
import fnmatch
import tempfile
import tarfile
from distutils import log

try:
    from site import USER_SITE
except ImportError:
    USER_SITE = None

try:
    import subprocess

    def _python_cmd(*args):
        args = (sys.executable,) + args
        return subprocess.call(args) == 0

except ImportError:
    # will be used for python 2.3
    def _python_cmd(*args):
        args = (sys.executable,) + args
        # quoting arguments if windows
        if sys.platform == 'win32':
            def quote(arg):
                if ' ' in arg:
                    return '"%s"' % arg
                return arg
            args = [quote(arg) for arg in args]
        return os.spawnl(os.P_WAIT, sys.executable, *args) == 0

DEFAULT_VERSION = "0.6.14"
DEFAULT_URL = "http://pypi.python.org/packages/source/d/distribute/"
SETUPTOOLS_FAKED_VERSION = "0.6c11"

SETUPTOOLS_PKG_INFO = """\
Metadata-Version: 1.0
Name: setuptools
Version: %s
Summary: xxxx
Home-page: xxx
Author: xxx
Author-email: xxx
License: xxx
Description: xxx
""" % SETUPTOOLS_FAKED_VERSION


def _install(tarball):
    # extracting the tarball
    tmpdir = tempfile.mkdtemp()
    log.warn('Extracting in %s', tmpdir)
    old_wd = os.getcwd()
    try:
        os.chdir(tmpdir)
        tar = tarfile.open(tarball)
        _extractall(tar)
        tar.close()

        # going in the directory
        subdir = os.path.join(tmpdir, os.listdir(tmpdir)[0])
        os.chdir(subdir)
        log.warn('Now working in %s', subdir)

        # installing
        log.warn('Installing Distribute')
        if not _python_cmd('setup.py', 'install'):
            log.warn('Something went wrong during the installation.')
            log.warn('See the error message above.')
    finally:
        os.chdir(old_wd)


def _build_egg(egg, tarball, to_dir):
    # extracting the tarball
    tmpdir = tempfile.mkdtemp()
    log.warn('Extracting in %s', tmpdir)
    old_wd = os.getcwd()
    try:
        os.chdir(tmpdir)
        tar = tarfile.open(tarball)
        _extractall(tar)
        tar.close()

        # going in the directory
        subdir = os.path.join(tmpdir, os.listdir(tmpdir)[0])
        os.chdir(subdir)
        log.warn('Now working in %s', subdir)

        # building an egg
        log.warn('Building a Distribute egg in %s', to_dir)
        _python_cmd('setup.py', '-q', 'bdist_egg', '--dist-dir', to_dir)

    finally:
        os.chdir(old_wd)
    # returning the result
    log.warn(egg)
    if not os.path.exists(egg):
        raise IOError('Could not build the egg.')


def _do_download(version, download_base, to_dir, download_delay):
    egg = os.path.join(to_dir, 'distribute-%s-py%d.%d.egg'
                       % (version, sys.version_info[0], sys.version_info[1]))
    if not os.path.exists(egg):
        tarball = download_setuptools(version, download_base,
                                      to_dir, download_delay)
        _build_egg(egg, tarball, to_dir)
    sys.path.insert(0, egg)
    import setuptools
    setuptools.bootstrap_install_from = egg


def use_setuptools(version=DEFAULT_VERSION, download_base=DEFAULT_URL,
                   to_dir=os.curdir, download_delay=15, no_fake=True):
    # making sure we use the absolute path
    to_dir = os.path.abspath(to_dir)
    was_imported = 'pkg_resources' in sys.modules or \
        'setuptools' in sys.modules
    try:
        try:
            import pkg_resources
            if not hasattr(pkg_resources, '_distribute'):
                if not no_fake:
                    _fake_setuptools()
                raise ImportError
        except ImportError:
            return _do_download(version, download_base, to_dir, download_delay)
        try:
            pkg_resources.require("distribute>="+version)
            return
        except pkg_resources.VersionConflict:
            e = sys.exc_info()[1]
            if was_imported:
                sys.stderr.write(
                "The required version of distribute (>=%s) is not available,\n"
                "and can't be installed while this script is running. Please\n"
                "install a more recent version first, using\n"
                "'easy_install -U distribute'."
                "\n\n(Currently using %r)\n" % (version, e.args[0]))
                sys.exit(2)
            else:
                del pkg_resources, sys.modules['pkg_resources']  # reload ok
                return _do_download(version, download_base, to_dir,
                                    download_delay)
        except pkg_resources.DistributionNotFound:
            return _do_download(version, download_base, to_dir,
                                download_delay)
    finally:
        if not no_fake:
            _create_fake_setuptools_pkg_info(to_dir)


def download_setuptools(version=DEFAULT_VERSION, download_base=DEFAULT_URL,
                        to_dir=os.curdir, delay=15):
    """Download distribute from a specified location and return its filename

    `version` should be a valid distribute version number that is available
    as an egg for download under the `download_base` URL (which should end
    with a '/'). `to_dir` is the directory where the egg will be downloaded.
    `delay` is the number of seconds to pause before an actual download
    attempt.
    """
    # making sure we use the absolute path
    to_dir = os.path.abspath(to_dir)
    try:
        from urllib.request import urlopen
    except ImportError:
        from urllib2 import urlopen
    tgz_name = "distribute-%s.tar.gz" % version
    url = download_base + tgz_name
    saveto = os.path.join(to_dir, tgz_name)
    src = dst = None
    if not os.path.exists(saveto):  # Avoid repeated downloads
        try:
            log.warn("Downloading %s", url)
            src = urlopen(url)
            # Read/write all in one block, so we don't create a corrupt file
            # if the download is interrupted.
            data = src.read()
            dst = open(saveto, "wb")
            dst.write(data)
        finally:
            if src:
                src.close()
            if dst:
                dst.close()
    return os.path.realpath(saveto)


def _no_sandbox(function):
    def __no_sandbox(*args, **kw):
        try:
            from setuptools.sandbox import DirectorySandbox
            if not hasattr(DirectorySandbox, '_old'):
                def violation(*args):
                    pass
                DirectorySandbox._old = DirectorySandbox._violation
                DirectorySandbox._violation = violation
                patched = True
            else:
                patched = False
        except ImportError:
            patched = False

        try:
            return function(*args, **kw)
        finally:
            if patched:
                DirectorySandbox._violation = DirectorySandbox._old
                del DirectorySandbox._old

    return __no_sandbox


def _patch_file(path, content):
    """Will backup the file then patch it"""
    existing_content = open(path).read()
    if existing_content == content:
        # already patched
        log.warn('Already patched.')
        return False
    log.warn('Patching...')
    _rename_path(path)
    f = open(path, 'w')
    try:
        f.write(content)
    finally:
        f.close()
    return True

_patch_file = _no_sandbox(_patch_file)


def _same_content(path, content):
    return open(path).read() == content


def _rename_path(path):
    new_name = path + '.OLD.%s' % time.time()
    log.warn('Renaming %s into %s', path, new_name)
    os.rename(path, new_name)
    return new_name


def _remove_flat_installation(placeholder):
    if not os.path.isdir(placeholder):
        log.warn('Unknown installation at %s', placeholder)
        return False
    found = False
    for file in os.listdir(placeholder):
        if fnmatch.fnmatch(file, 'setuptools*.egg-info'):
            found = True
            break
    if not found:
        log.warn('Could not locate setuptools*.egg-info')
        return

    log.warn('Removing elements out of the way...')
    pkg_info = os.path.join(placeholder, file)
    if os.path.isdir(pkg_info):
        patched = _patch_egg_dir(pkg_info)
    else:
        patched = _patch_file(pkg_info, SETUPTOOLS_PKG_INFO)

    if not patched:
        log.warn('%s already patched.', pkg_info)
        return False
    # now let's move the files out of the way
    for element in ('setuptools', 'pkg_resources.py', 'site.py'):
        element = os.path.join(placeholder, element)
        if os.path.exists(element):
            _rename_path(element)
        else:
            log.warn('Could not find the %s element of the '
                     'Setuptools distribution', element)
    return True

_remove_flat_installation = _no_sandbox(_remove_flat_installation)


def _after_install(dist):
    log.warn('After install bootstrap.')
    placeholder = dist.get_command_obj('install').install_purelib
    _create_fake_setuptools_pkg_info(placeholder)


def _create_fake_setuptools_pkg_info(placeholder):
    if not placeholder or not os.path.exists(placeholder):
        log.warn('Could not find the install location')
        return
    pyver = '%s.%s' % (sys.version_info[0], sys.version_info[1])
    setuptools_file = 'setuptools-%s-py%s.egg-info' % \
            (SETUPTOOLS_FAKED_VERSION, pyver)
    pkg_info = os.path.join(placeholder, setuptools_file)
    if os.path.exists(pkg_info):
        log.warn('%s already exists', pkg_info)
        return

    log.warn('Creating %s', pkg_info)
    f = open(pkg_info, 'w')
    try:
        f.write(SETUPTOOLS_PKG_INFO)
    finally:
        f.close()

    pth_file = os.path.join(placeholder, 'setuptools.pth')
    log.warn('Creating %s', pth_file)
    f = open(pth_file, 'w')
    try:
        f.write(os.path.join(os.curdir, setuptools_file))
    finally:
        f.close()

_create_fake_setuptools_pkg_info = _no_sandbox(_create_fake_setuptools_pkg_info)


def _patch_egg_dir(path):
    # let's check if it's already patched
    pkg_info = os.path.join(path, 'EGG-INFO', 'PKG-INFO')
    if os.path.exists(pkg_info):
        if _same_content(pkg_info, SETUPTOOLS_PKG_INFO):
            log.warn('%s already patched.', pkg_info)
            return False
    _rename_path(path)
    os.mkdir(path)
    os.mkdir(os.path.join(path, 'EGG-INFO'))
    pkg_info = os.path.join(path, 'EGG-INFO', 'PKG-INFO')
    f = open(pkg_info, 'w')
    try:
        f.write(SETUPTOOLS_PKG_INFO)
    finally:
        f.close()
    return True

_patch_egg_dir = _no_sandbox(_patch_egg_dir)


def _before_install():
    log.warn('Before install bootstrap.')
    _fake_setuptools()


def _under_prefix(location):
    if 'install' not in sys.argv:
        return True
    args = sys.argv[sys.argv.index('install')+1:]
    for index, arg in enumerate(args):
        for option in ('--root', '--prefix'):
            if arg.startswith('%s=' % option):
                top_dir = arg.split('root=')[-1]
                return location.startswith(top_dir)
            elif arg == option:
                if len(args) > index:
                    top_dir = args[index+1]
                    return location.startswith(top_dir)
        if arg == '--user' and USER_SITE is not None:
            return location.startswith(USER_SITE)
    return True


def _fake_setuptools():
    log.warn('Scanning installed packages')
    try:
        import pkg_resources
    except ImportError:
        # we're cool
        log.warn('Setuptools or Distribute does not seem to be installed.')
        return
    ws = pkg_resources.working_set
    try:
        setuptools_dist = ws.find(pkg_resources.Requirement.parse('setuptools',
                                  replacement=False))
    except TypeError:
        # old distribute API
        setuptools_dist = ws.find(pkg_resources.Requirement.parse('setuptools'))

    if setuptools_dist is None:
        log.warn('No setuptools distribution found')
        return
    # detecting if it was already faked
    setuptools_location = setuptools_dist.location
    log.warn('Setuptools installation detected at %s', setuptools_location)

    # if --root or --prefix was provided, and if
    # setuptools is not located in them, we don't patch it
    if not _under_prefix(setuptools_location):
        log.warn('Not patching, --root or --prefix is installing Distribute'
                 ' in another location')
        return

    # let's see if its an egg
    if not setuptools_location.endswith('.egg'):
        log.warn('Non-egg installation')
        res = _remove_flat_installation(setuptools_location)
        if not res:
            return
    else:
        log.warn('Egg installation')
        pkg_info = os.path.join(setuptools_location, 'EGG-INFO', 'PKG-INFO')
        if (os.path.exists(pkg_info) and
            _same_content(pkg_info, SETUPTOOLS_PKG_INFO)):
            log.warn('Already patched.')
            return
        log.warn('Patching...')
        # let's create a fake egg replacing setuptools one
        res = _patch_egg_dir(setuptools_location)
        if not res:
            return
    log.warn('Patching done.')
    _relaunch()


def _relaunch():
    log.warn('Relaunching...')
    # we have to relaunch the process
    # pip marker to avoid a relaunch bug
    if sys.argv[:3] == ['-c', 'install', '--single-version-externally-managed']:
        sys.argv[0] = 'setup.py'
    args = [sys.executable] + sys.argv
    sys.exit(subprocess.call(args))


def _extractall(self, path=".", members=None):
    """Extract all members from the archive to the current working
    directory and set owner, modification time and permissions on
    directories afterwards. `path' specifies a different directory
    to extract to. `members' is optional and must be a subset of the
    list returned by getmembers().
    """
    import copy
    import operator
    from tarfile import ExtractError
    directories = []

    if members is None:
        members = self

    for tarinfo in members:
        if tarinfo.isdir():
            # Extract directories with a safe mode.
            directories.append(tarinfo)
            tarinfo = copy.copy(tarinfo)
            tarinfo.mode = 448  # decimal for oct 0700
        self.extract(tarinfo, path)

    # Reverse sort directories.
    if sys.version_info < (2, 4):
        def sorter(dir1, dir2):
            return cmp(dir1.name, dir2.name)
        directories.sort(sorter)
        directories.reverse()
    else:
        directories.sort(key=operator.attrgetter('name'), reverse=True)

    # Set correct owner, mtime and filemode on directories.
    for tarinfo in directories:
        dirpath = os.path.join(path, tarinfo.name)
        try:
            self.chown(tarinfo, dirpath)
            self.utime(tarinfo, dirpath)
            self.chmod(tarinfo, dirpath)
        except ExtractError:
            e = sys.exc_info()[1]
            if self.errorlevel > 1:
                raise
            else:
                self._dbg(1, "tarfile: %s" % e)


def main(argv, version=DEFAULT_VERSION):
    """Install or upgrade setuptools and EasyInstall"""
    tarball = download_setuptools()
    _install(tarball)


if __name__ == '__main__':
    main(sys.argv[1:])
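The `_extractall` helper above is a backport of `TarFile.extractall` for very old Pythons; on any modern Python the same extract-into-a-temp-dir flow used by `_install()` reduces to the built-in method. A minimal self-contained sketch (the archive content is made up for the example):

```python
import io
import os
import tarfile
import tempfile

# Build a one-file archive in memory, then extract it, mirroring what
# _install() does with the downloaded tarball (mkdtemp + extractall).
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    data = b"print('hello')\n"
    info = tarfile.TarInfo(name="pkg/setup.py")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

buf.seek(0)
tmpdir = tempfile.mkdtemp()
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    tar.extractall(tmpdir)  # modern equivalent of the _extractall backport

extracted = os.path.join(tmpdir, "pkg", "setup.py")
```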
2
fr.po

@ -246,7 +246,7 @@ msgstr "La barre de progression n'est pas disponible, désactivation"
 
 #: gcp:595
 msgid ""
-"Invalid --preserve value\n"
+"Invalide --preserve value\n"
 "valid values are:"
 msgstr ""
 "La valeur de «--preserve» est invalide\n"
140
gcp
@ -1,4 +1,5 @@
-#!/usr/bin/env python3
+#!/usr/bin/python
+# -*- coding: utf-8 -*-
 
 """
 gcp: Goffi's CoPier
@@ -28,26 +27,26 @@ logging.basicConfig(level=logging.INFO,
 ###

 import gettext
-gettext.install('gcp', "i18n", unicode=True)
+gettext.install('gcp', "i18n")

 import sys
 import os,os.path
-from optparse import OptionParser, OptionGroup #To be replaced by argparse ASAP
-import cPickle as pickle
+from argparse import ArgumentParser
+import pickle
 try:
-    import gobject
+    from gi.repository import GObject
     #DBus
     import dbus, dbus.glib
     import dbus.service
     import dbus.mainloop.glib
-except ImportError,e:
+except ImportError as e:
     error(_("Error during import"))
     error(_("Please check dependecies:"),e)
     exit(1)
 try:
     from progressbar import ProgressBar, Percentage, Bar, ETA, FileTransferSpeed
     pbar_available=True
-except ImportError, e:
+except ImportError as e:
     info (_('ProgressBar not available, please download it at http://pypi.python.org/pypi/progressbar'))
     info (_('Progress bar deactivated\n--\n'))
     pbar_available=False
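The import hunk above is a mechanical Python 2 → 3 port: `except ImportError,e:` becomes `except ImportError as e:`, `cPickle` becomes `pickle`, and static `gobject` bindings become `gi.repository.GObject`. A minimal sketch of the exception-syntax change (`try_import` is an illustrative helper, not part of gcp):

```python
def try_import(name):
    """Import a module by name; return (module, error) instead of raising."""
    try:
        return __import__(name), None
    except ImportError as e:  # Python 2 spelled this "except ImportError, e:"
        return None, e

mod, err = try_import("os")
assert err is None

mod, err = try_import("module_that_does_not_exist")
assert mod is None and isinstance(err, ImportError)
```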
@@ -104,11 +103,13 @@ class DbusObject(dbus.service.Object):
         @return: success (boolean) and error message if any (string)"""
         try:
             args = pickle.loads(str(args))
-        except TypeError, pickle.UnpicklingError:
+        except TypeError as e:
+            pickle.UnpicklingError = e
             return (False, _("INTERNAL ERROR: invalid arguments"))
         try:
             source_dir = pickle.loads(str(source_dir))
-        except TypeError, pickle.UnpicklingError:
+        except TypeError as e:
+            pickle.UnpicklingError = e
             return (False, _("INTERNAL ERROR: invalid source_dir"))
         return self._gcp.parseArguments(args, source_dir)
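The hunk above marshals gcp's argument list through D-Bus as a pickled blob and reports a tuple on failure; the odd `pickle.UnpicklingError = e` line looks like a 2to3-style mechanical rewrite of the Python 2 form `except TypeError, pickle.UnpicklingError:` (the original intent was presumably `except (TypeError, pickle.UnpicklingError):`). A round-trip sketch of the marshalling, with that presumed intent; `pack_args`/`unpack_args` are illustrative names, not gcp's API:

```python
import pickle

def pack_args(args):
    # gcp-style: serialize the CLI argument list to a byte string for D-Bus.
    return pickle.dumps(args)

def unpack_args(blob):
    # Return (success, payload): payload is the args on success, an error
    # message otherwise, mirroring the (boolean, message) convention above.
    try:
        return True, pickle.loads(blob)
    except (TypeError, pickle.UnpicklingError):
        return False, "INTERNAL ERROR: invalid arguments"

ok, args = unpack_args(pack_args(["-r", "music/", "/mnt/usb"]))
assert ok and args == ["-r", "music/", "/mnt/usb"]

ok, msg = unpack_args(b"not a pickle")
assert not ok
```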
@@ -195,7 +196,7 @@ class GCP():
                 dbus_interface=const_DBUS_INTERFACE)
             self._main_instance = False

-        except dbus.exceptions.DBusException,e:
+        except dbus.exceptions.DBusException as e:
             if e._dbus_error_name=='org.freedesktop.DBus.Error.ServiceUnknown':
                 self.launchDbusMainInstance()
                 debug (_("gcp launched"))
@@ -231,7 +232,7 @@ class GCP():
         #(check freedesktop mounting signals)
         ret = {}
         try:
-            with open("/proc/mounts",'rb') as mounts:
+            with open("/proc/mounts",'r') as mounts:
                 for line in mounts.readlines():
                     fs_spec, fs_file, fs_vfstype, fs_mntops, fs_freq, fs_passno = line.split(' ')
                     ret[fs_file] = fs_vfstype
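The hunk above maps mount points to filesystem types by splitting each `/proc/mounts` line into its six standard fields. The same parse, run on a canned sample so it also works off-Linux (the sample lines are illustrative):

```python
# Each /proc/mounts line is:
# "fs_spec fs_file fs_vfstype fs_mntops fs_freq fs_passno"
SAMPLE = """\
/dev/sda1 / ext4 rw,relatime 0 0
/dev/sdb1 /mnt/usb vfat rw,nosuid 0 0
"""

def mount_types(text):
    # Build {mount_point: fs_type}, as gcp's getFsType lookup expects.
    ret = {}
    for line in text.splitlines():
        fs_spec, fs_file, fs_vfstype, fs_mntops, fs_freq, fs_passno = line.split(' ')
        ret[fs_file] = fs_vfstype
    return ret

assert mount_types(SAMPLE) == {"/": "ext4", "/mnt/usb": "vfat"}
```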
@@ -243,14 +244,12 @@ class GCP():
         """Add a file to the copy list
         @param path: absolute path of file
         @param options: options as return by optparse"""
-        debug (_("Adding to copy list: %(path)s ==> %(dest_path)s (%(fs_type)s)") % {"path":path.decode('utf-8','replace'),
-                                                                                    "dest_path":dest_path.decode('utf-8','replace'),
-                                                                                    "fs_type":self.getFsType(dest_path)} )
+        debug (_("Adding to copy list: %(path)s ==> %(dest_path)s (%(fs_type)s)") % {"path":path, "dest_path":dest_path, "fs_type":self.getFsType(dest_path)} )
         try:
             self.bytes_total+=os.path.getsize(path)
             self.copy_list.insert(0,(path, dest_path, options))
-        except OSError,e:
-            error(_("Can't copy %(path)s: %(exception)s") % {'path':path.decode('utf-8','replace'), 'exception':e.strerror})
+        except OSError as e:
+            error(_("Can't copy %(path)s: %(exception)s") % {'path':path, 'exception':e.strerror})


     def __appendDirToList(self, dirpath, dest_path, options):
@@ -268,21 +267,19 @@ class GCP():
             for filename in os.listdir(dirpath):
                 filepath = os.path.join(dirpath,filename)
                 if os.path.islink(filepath) and not options.dereference:
-                    debug ("Skippink symbolic dir: %s" % filepath.decode('utf-8','replace'))
+                    debug ("Skippink symbolic dir: %s" % filepath)
                     continue
                 if os.path.isdir(filepath):
                     full_dest_path = os.path.join(dest_path,filename)
                     self.__appendDirToList(filepath, full_dest_path, options)
                 else:
                     self.__appendToList(filepath, dest_path, options)
-        except OSError,e:
+        except OSError as e:
             try:
-                error(_("Can't append %(path)s to copy list: %(exception)s") % {'path':filepath.decode('utf-8','replace'),
-                                                                                'exception':e.strerror})
+                error(_("Can't append %(path)s to copy list: %(exception)s") % {'path':filepath, 'exception':e.strerror})
             except NameError:
                 #We can't list the dir
-                error(_("Can't access %(dirpath)s: %(exception)s") % {'dirpath':dirpath.decode('utf-8','replace'),
-                                                                      'exception':e.strerror})
+                error(_("Can't access %(dirpath)s: %(exception)s") % {'dirpath':dirpath, 'exception':e.strerror})

     def __checkArgs(self, options, source_dir, args):
         """Check thats args are files, and add them to copy list
@@ -293,17 +290,17 @@ class GCP():
         len_args = len(args)
         try:
             dest_path = os.path.normpath(os.path.join(source_dir, args.pop()))
-        except OSError,e:
+        except OSError as e:
             error (_("Invalid dest_path: %s"),e)

         for path in args:
             abspath = os.path.normpath(os.path.join(os.path.expanduser(source_dir), path))
             if not os.path.exists(abspath):
-                warning(_("The path given in arg doesn't exist or is not accessible: %s") % abspath.decode('utf-8','replace'))
+                warning(_("The path given in arg doesn't exist or is not accessible: %s") % abspath)
             else:
                 if os.path.isdir(abspath):
                     if not options.recursive:
-                        warning (_('omitting directory "%s"') % abspath.decode('utf-8','replace'))
+                        warning (_('omitting directory "%s"') % abspath)
                     else:
                         _basename=os.path.basename(os.path.normpath(path))
                         full_dest_path = dest_path if options.directdir else os.path.normpath(os.path.join(dest_path, _basename))
@@ -328,7 +325,7 @@ class GCP():
         assert(filename)
         dest_file = self.__filename_fix(options.dest_file,options) if options.dest_file else self.__filename_fix(os.path.join(dest_path,filename),options)
         if os.path.exists(dest_file) and not options.force:
-            warning (_("File [%s] already exists, skipping it !") % dest_file.decode('utf-8','replace'))
+            warning (_("File [%s] already exists, skipping it !") % dest_file)
             self.journal.copyFailed()
             self.journal.error("already exists")
             self.journal.closeFile()
@@ -343,11 +340,10 @@ class GCP():
                 source_fd.close()
                 return True

-            gobject.io_add_watch(source_fd,gobject.IO_IN,self._copyFile,
-                                 (dest_fd, options), priority=gobject.PRIORITY_DEFAULT)
+            GObject.io_add_watch(source_fd, GObject.IO_IN,self._copyFile,
+                                 (dest_fd, options), priority=GObject.PRIORITY_DEFAULT)
             if not self.progress:
-                info(_("COPYING %(source)s ==> %(dest)s") % {"source":source_file.decode('utf-8','replace'),
-                                                             "dest":dest_file.decode('utf-8','replace')})
+                info(_("COPYING %(source)s ==> %(dest)s") % {"source":source_file, "dest":dest_file})
             return True
         else:
             #Nothing left to copy, we quit
@@ -440,7 +436,7 @@ class GCP():
                     os.chown(dest_file, st_file.st_uid, st_file.st_gid)
                 elif preserve == 'timestamps':
                     os.utime(dest_file, (st_file.st_atime, st_file.st_mtime))
-            except OSError,e:
+            except OSError as e:
                 self.journal.error("preserve-"+preserve)

     def __get_string_size(self, size):
@@ -544,72 +540,72 @@ class GCP():
         @return: a tuple (boolean, message) where the boolean is the success of the arguments
         validation, and message is the error message to print when necessary"""
         _usage="""
-%prog [options] FILE DEST
-%prog [options] FILE1 [FILE2 ...] DEST-DIR
+%(prog)s [options] FILE DEST
+%(prog)s [options] FILE1 [FILE2 ...] DEST-DIR

-%prog --help for options list
+%(prog)s --help for options list
         """
         for idx in range(len(full_args)):
-            if isinstance(full_args[idx], unicode):
-                #We don't want unicode as some filenames can be invalid unicode
-                full_args[idx] = full_args[idx].encode('utf-8')
+            full_args[idx] = full_args[idx].encode('utf-8')

-        parser = OptionParser(usage=_usage,version=ABOUT)
+        parser = ArgumentParser(usage=_usage)

-        parser.add_option("-r", "--recursive", action="store_true", default=False,
+        parser.add_argument("-r", "--recursive", action="store_true", default=False,
                           help=_("copy directories recursively"))

-        parser.add_option("-f", "--force", action="store_true", default=False,
+        parser.add_argument("-f", "--force", action="store_true", default=False,
                           help=_("force overwriting of existing files"))

-        parser.add_option("--preserve", action="store", default='mode,ownership,timestamps',
+        parser.add_argument("--preserve", action="store", default='',
                           help=_("preserve the specified attributes"))

-        parser.add_option("-L", "--dereference", action="store_true", default=False,
+        parser.add_argument("-L", "--dereference", action="store_true", default=False,
                           help=_("always follow symbolic links in sources"))

-        parser.add_option("-P", "--no-dereference", action="store_false", dest='dereference',
+        parser.add_argument("-P", "--no-dereference", action="store_false", dest='dereference',
                           help=_("never follow symbolic links in sources"))

-        #parser.add_option("--no-unicode-fix", action="store_false", dest='unicode_fix', default=True,
+        #parser.add_argument("--no-unicode-fix", action="store_false", dest='unicode_fix', default=True,
         #                  help=_("don't fix name encoding errors")) #TODO

-        parser.add_option("--fs-fix", action="store", dest='fs_fix', default='auto',
-                          help=_("fix filesystem name incompatibily; can be 'auto' (default), 'force' or 'no'"))
+        parser.add_argument("--fs-fix", choices = const_FS_FIX, dest='fs_fix', default='auto',
+                          help=_("fix filesystem name incompatibily (default: auto)"))

-        parser.add_option("--no-fs-fix", action="store_true", dest='no_fs_fix', default=False,
+        parser.add_argument("--no-fs-fix", action="store_true", dest='no_fs_fix', default=False,
                           help=_("same as --fs-fix=no (overrides --fs-fix)"))

-        parser.add_option("--no-progress", action="store_false", dest="progress", default=True,
+        parser.add_argument("--no-progress", action="store_false", dest="progress", default=True,
                           help=_("deactivate progress bar"))

-        parser.add_option("-v", "--verbose", action="store_true", default=False,
+        parser.add_argument("-v", "--verbose", action="store_true", default=False,
                           help=_("Show what is currently done"))

-        group_saving = OptionGroup(parser, "sources saving")
+        parser.add_argument("-V", "--version", action="version", version=ABOUT)

-        group_saving.add_option("--sources-save", action="store",
+        group_saving = parser.add_argument_group("sources saving")
+
+        group_saving.add_argument("--sources-save", action="store",
                                 help=_("Save source arguments"))

-        group_saving.add_option("--sources-replace", action="store",
+        group_saving.add_argument("--sources-replace", action="store",
                                 help=_("Save source arguments and replace memory if it already exists"))

-        group_saving.add_option("--sources-load", action="store",
+        group_saving.add_argument("--sources-load", action="store",
                                 help=_("Load source arguments"))

-        group_saving.add_option("--sources-del", action="store",
+        group_saving.add_argument("--sources-del", action="store",
                                 help=_("delete saved sources"))

-        group_saving.add_option("--sources-list", action="store_true", default=False,
+        group_saving.add_argument("--sources-list", action="store_true", default=False,
                                 help=_("List names of saved sources"))

-        group_saving.add_option("--sources-full-list", action="store_true", default=False,
+        group_saving.add_argument("--sources-full-list", action="store_true", default=False,
                                 help=_("List names of saved sources and files in it"))

-        parser.add_option_group(group_saving)
+        parser.add_argument_group(group_saving)

-        (options, args) = parser.parse_args(full_args)
+        (options, args) = parser.parse_known_args()
         options.directdir = False #True only in the special case: we are copying a dir and it doesn't exists
         #options check
         if options.progress and not pbar_available:
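The hunk above is the core of the optparse → argparse migration: `add_option` becomes `add_argument`, `OptionGroup(parser, ...)` becomes `parser.add_argument_group(...)`, the `version=ABOUT` constructor keyword becomes an explicit `-V/--version` action, and `parse_args(full_args)` becomes `parse_known_args()` (which returns leftover positionals instead of raising). A minimal self-contained sketch using a few of the flags from the diff:

```python
from argparse import ArgumentParser

parser = ArgumentParser()
# add_option -> add_argument, same action/default keywords as in the diff.
parser.add_argument("-r", "--recursive", action="store_true", default=False)
parser.add_argument("--preserve", action="store", default='')
# OptionGroup(parser, "sources saving") -> parser.add_argument_group(...)
group_saving = parser.add_argument_group("sources saving")
group_saving.add_argument("--sources-save", action="store")

# parse_known_args leaves unknown/positional args in a second list,
# which gcp then treats as the FILE/DEST arguments.
opts, rest = parser.parse_known_args(["-r", "--preserve", "mode", "a", "b"])
assert opts.recursive is True
assert opts.preserve == "mode"
assert rest == ["a", "b"]
```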
@@ -623,21 +619,19 @@ class GCP():

         if options.no_fs_fix:
             options.fs_fix = 'no'
-        else:
-            if not options.fs_fix in const_FS_FIX:
-                error (_("Invalid --fs-fix value\nvalid values are:"))
-                for value in const_FS_FIX:
-                    error('- %s' % value)
-                exit(1)

-        preserve = set(options.preserve.split(','))
-        if not preserve.issubset(const_PRESERVE):
-            error (_("Invalid --preserve value\nvalid values are:"))
-            for value in const_PRESERVE:
-                error('- %s' % value)
-            exit(1)
+        if len(options.preserve):
+            preserve = set(options.preserve.split(','))
+            if not preserve.issubset(const_PRESERVE):
+                error (_("Invalid --preserve value\nvalid values are:"))
+                for value in const_PRESERVE:
+                    error('- %s' % value)
+                exit(1)
+            else:
+                options.preserve = preserve
         else:
-            options.preserve = preserve
+            options.preserve=set()

         self.__sourcesSaving(options, args)
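In the hunk above, the manual `--fs-fix` validation disappears because `add_argument(..., choices=const_FS_FIX)` now rejects bad values for free, while `--preserve` (whose default changed to the empty string) is still validated by hand as a comma-separated subset of `const_PRESERVE`. A sketch of that remaining check; the constant values are inferred from the surrounding diff, not copied from gcp's source:

```python
# Values assumed from context: chmod/chown/utime branches and the
# --fs-fix help text elsewhere in the diff.
const_PRESERVE = {'mode', 'ownership', 'timestamps'}
const_FS_FIX = {'auto', 'force', 'no'}

def check_preserve(value):
    """Return the parsed set for a valid --preserve value, None if invalid."""
    preserve = set(value.split(',')) if value else set()
    return preserve if preserve.issubset(const_PRESERVE) else None

assert check_preserve('mode,timestamps') == {'mode', 'timestamps'}
assert check_preserve('') == set()       # empty default: preserve nothing
assert check_preserve('bogus') is None   # would trigger the error/exit path
```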
@@ -666,11 +660,11 @@ class GCP():
         if len(args) < 2:
             _error_msg = _("Wrong number of arguments")
             return (False, _error_msg)
-        debug(_("adding args to gcp: %s") % str(args).decode('utf-8','replace'))
+        debug(_("adding args to gcp: %s") % args)
         self.__checkArgs(options, source_dir, args)
         if not self.__launched:
             self.journal = Journal()
-            gobject.idle_add(self.__copyNextFile)
+            GObject.idle_add(self.__copyNextFile)
             self.__launched = True
         return (True,'')
@@ -680,7 +674,7 @@ class GCP():

     def go(self):
         """Launch main loop"""
-        self.loop = gobject.MainLoop()
+        self.loop = GObject.MainLoop()
         try:
             self.loop.run()
         except KeyboardInterrupt:
gcp.po (2 changed lines)
@@ -141,7 +141,7 @@ msgstr ""

 #: gcp:345
 msgid ""
-"Invalide --preserve value\n"
+"Invalid --preserve value\n"
 "valid values are:"
 msgstr ""

setup.py (9 changed lines)
@@ -1,9 +1,6 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-

-from distribute_setup import use_setuptools
-use_setuptools()
-
 from setuptools import setup
 import sys
 from os import path
@@ -11,7 +8,7 @@ from os import path
 name = 'gcp'

 setup(name=name,
-      version='0.1.3',
+      version='0.1.4',
       description=u"gcp is an advanced copy tool loosely inspired from cp",
       long_description=u'gcp is a command-line tool to copy files, loosely inspired from cp, but with high level functionalities such as progress bar, copy continuation on error, journaling to know which files were successfuly copied, name mangling to workaround filesystem limitations (FAT), unique copy queue, copy list managemet, command arguments close to cp',
       author='Goffi (Jérôme Poisson)',
@@ -24,8 +21,8 @@ setup(name=name,
                    'Programming Language :: Python',
                    'Topic :: Utilities'
                    ],
-      data_files=[(path.join(sys.prefix,'share/locale/fr/LC_MESSAGES'), ['i18n/fr/LC_MESSAGES/gcp.mo']),
+      data_files=[('share/locale/fr/LC_MESSAGES', ['i18n/fr/LC_MESSAGES/gcp.mo']),
                   ('share/man/man1', ["gcp.1"]),
-                  ('share/doc/%s' % name, ['COPYING','README'])],
+                  ('share/doc/%s' % name, ['COPYING','README.md'])],
       scripts=['gcp'],
      )