migrate to python3
- updated .gitignore
- updated README to markdown format
- removed the deprecated distribute bootstrap
- replaced optparse with argparse
- removed all the deprecated encoding/decoding calls
- fixed some syntax errors

This package should now be Python 3 only.
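One of the mechanical changes in this commit is `import cPickle as pickle` becoming `import pickle`: Python 3 has no separate cPickle module, and the C accelerator is used automatically. A minimal sketch of the pickle round-trip gcp relies on when forwarding arguments to an already-running instance (the argument list here is illustrative, not a real invocation):

```python
import pickle

# arguments as one gcp instance might serialize them for the running one
args = ["-r", "music/", "/media/usb"]
blob = pickle.dumps(args)      # bytes, suitable for transport over DBus
restored = pickle.loads(blob)  # identical list on the receiving side
```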
This commit is contained in:
parent
aad446095c
commit
774f371070
.gitignore

@@ -1,3 +1,5 @@
 *.pyc
+*.pyv
-*.sw?
+*.sw*
 *.tar*
+*.egg
README (50 lines changed)

@@ -1,10 +1,11 @@
 gcp v0.1.3
+====
 (c) Jérôme Poisson aka Goffi 2010, 2011

 gcp (Goffi's cp) is a files copier.


-** LICENSE **
+### LICENSE

 gcp is free software: you can redistribute it and/or modify
 it under the terms of the GNU General Public License as published by
@@ -21,25 +22,26 @@ along with gcp. If not, see <http://www.gnu.org/licenses/>.



-** WTF ? **
+### WTF ?
 gcp is a file copier, loosely inspired from cp, but with high level functionalities like:
 - progression indicator
 - gcp continue copying even when there is an issue: he just skip the file with problem, and go on
 - journalization: gcp write what he is doing, this allow to know which files were effectively copied
 - fixing names to be compatible with the target filesystem (e.g. removing incompatible chars like "?" or "*" on vfat)
 - if you launch a copy when an other is already running, the files are added to the first queue, this avoid your hard drive to move its read/write head all the time
 - files saving: you can keep track of files you have copied, and re-copy them later (useful when, for example, you always copy some free music to all your friends).
 - gcp will be approximately option-compatible with cp (approximately because the behaviour is not exactly the same, see below)

-/!\ WARNING /!\
+**WARNING**
 gcp is at an early stage of development, and really experimental: use at your own risks !

-** How to use it ? **
+### How to use it ?
 Pretty much like cp (see gcp --help).
 Please note that the behaviour is not exactly the same as cp, even if gcp want to be option-compatible. Mainly, the destination filenames can be changed (by default, can be deactivated).
 gcp doesn't implement yet all the options from cp, but it's planed.

-** journalizaion **
+### Journalizaion
 The journal is planed to be used by gcp itself, buts remains human-readable. It is located in ~/.gcp/journal

 3 states are used:
@@ -49,7 +51,7 @@ The journal is planed to be used by gcp itself, buts remains human-readable. It

 after the state, a list of things which went wront are show, separated by ", "

-** What's next ? **
+### What's next ?

 Several improvment are already planed
 - copy queue management (moving copy order)
@@ -65,32 +67,32 @@ Several improvment are already planed
 - distant copy (ftp)
 - basic server mode, for copying files on network without the need of nfs or other heavy stuff

-** Credits **
+### Credits

 A big big thank to the authors/contributors of...

-progressbar:
+* progressbar:
 gcp use ProgressBar (http://pypi.python.org/pypi/progressbar/2.2), a class coded by Nilton Volpato which allow the textual representation of progression.

-GLib:
+* GLib:
 This heavily used library is used here for the main loop, event catching, and for DBus. Get it at http://library.gnome.org/devel/glib/

-DBus:
+* DBus:
 This excellent IPC is in the heart of gcp. Get more information at www.freedesktop.org/wiki/Software/dbus

-python and its amazing standard library:
+* python and its amazing standard library:
 gcp was coded quickly for my own need thanks to this excellent and efficient language and its really huge standard library. Python can be download at www.python.org

 If I forgot any credit, please contact me (mail below) to fix it.

 Big thanks to contributors and package mainteners

-** Contributions **
+### Contributions
 2011: Thomas Preud'homme <robotux@celest.fr>: manpage, stat resolution fix


-** Contact **
+### Contact

 You can contact me at goffi@goffi.org .
 You'll find the latest version on my ftp: ftp://ftp.goffi.org/gcp, or check the wiki ( http://wiki.goffi.org/wiki/Gcp )
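The name-fixing feature listed in the README (dropping characters like "?" or "*" that vfat cannot store) can be sketched as follows; this is an illustration only, not gcp's actual implementation, and the forbidden character set is an assumption:

```python
# characters assumed invalid on the target filesystem (illustrative set)
VFAT_FORBIDDEN = '?*:<>|"\\'

def fix_name(filename, forbidden=VFAT_FORBIDDEN, replacement="_"):
    """Replace characters the destination filesystem cannot represent."""
    return "".join(replacement if c in forbidden else c for c in filename)

fixed = fix_name("track?.mp3")
```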
distribute_setup.py (file removed)

@@ -1,485 +0,0 @@
#!python
"""Bootstrap distribute installation

If you want to use setuptools in your package's setup.py, just include this
file in the same directory with it, and add this to the top of your setup.py::

    from distribute_setup import use_setuptools
    use_setuptools()

If you want to require a specific version of setuptools, set a download
mirror, or use an alternate download directory, you can do so by supplying
the appropriate options to ``use_setuptools()``.

This file can also be run as a script to install or upgrade setuptools.
"""
import os
import sys
import time
import fnmatch
import tempfile
import tarfile
from distutils import log

try:
    from site import USER_SITE
except ImportError:
    USER_SITE = None

try:
    import subprocess

    def _python_cmd(*args):
        args = (sys.executable,) + args
        return subprocess.call(args) == 0

except ImportError:
    # will be used for python 2.3
    def _python_cmd(*args):
        args = (sys.executable,) + args
        # quoting arguments if windows
        if sys.platform == 'win32':
            def quote(arg):
                if ' ' in arg:
                    return '"%s"' % arg
                return arg
            args = [quote(arg) for arg in args]
        return os.spawnl(os.P_WAIT, sys.executable, *args) == 0

DEFAULT_VERSION = "0.6.14"
DEFAULT_URL = "http://pypi.python.org/packages/source/d/distribute/"
SETUPTOOLS_FAKED_VERSION = "0.6c11"

SETUPTOOLS_PKG_INFO = """\
Metadata-Version: 1.0
Name: setuptools
Version: %s
Summary: xxxx
Home-page: xxx
Author: xxx
Author-email: xxx
License: xxx
Description: xxx
""" % SETUPTOOLS_FAKED_VERSION


def _install(tarball):
    # extracting the tarball
    tmpdir = tempfile.mkdtemp()
    log.warn('Extracting in %s', tmpdir)
    old_wd = os.getcwd()
    try:
        os.chdir(tmpdir)
        tar = tarfile.open(tarball)
        _extractall(tar)
        tar.close()

        # going in the directory
        subdir = os.path.join(tmpdir, os.listdir(tmpdir)[0])
        os.chdir(subdir)
        log.warn('Now working in %s', subdir)

        # installing
        log.warn('Installing Distribute')
        if not _python_cmd('setup.py', 'install'):
            log.warn('Something went wrong during the installation.')
            log.warn('See the error message above.')
    finally:
        os.chdir(old_wd)


def _build_egg(egg, tarball, to_dir):
    # extracting the tarball
    tmpdir = tempfile.mkdtemp()
    log.warn('Extracting in %s', tmpdir)
    old_wd = os.getcwd()
    try:
        os.chdir(tmpdir)
        tar = tarfile.open(tarball)
        _extractall(tar)
        tar.close()

        # going in the directory
        subdir = os.path.join(tmpdir, os.listdir(tmpdir)[0])
        os.chdir(subdir)
        log.warn('Now working in %s', subdir)

        # building an egg
        log.warn('Building a Distribute egg in %s', to_dir)
        _python_cmd('setup.py', '-q', 'bdist_egg', '--dist-dir', to_dir)

    finally:
        os.chdir(old_wd)
    # returning the result
    log.warn(egg)
    if not os.path.exists(egg):
        raise IOError('Could not build the egg.')


def _do_download(version, download_base, to_dir, download_delay):
    egg = os.path.join(to_dir, 'distribute-%s-py%d.%d.egg'
                       % (version, sys.version_info[0], sys.version_info[1]))
    if not os.path.exists(egg):
        tarball = download_setuptools(version, download_base,
                                      to_dir, download_delay)
        _build_egg(egg, tarball, to_dir)
    sys.path.insert(0, egg)
    import setuptools
    setuptools.bootstrap_install_from = egg


def use_setuptools(version=DEFAULT_VERSION, download_base=DEFAULT_URL,
                   to_dir=os.curdir, download_delay=15, no_fake=True):
    # making sure we use the absolute path
    to_dir = os.path.abspath(to_dir)
    was_imported = 'pkg_resources' in sys.modules or \
        'setuptools' in sys.modules
    try:
        try:
            import pkg_resources
            if not hasattr(pkg_resources, '_distribute'):
                if not no_fake:
                    _fake_setuptools()
                raise ImportError
        except ImportError:
            return _do_download(version, download_base, to_dir, download_delay)
        try:
            pkg_resources.require("distribute>="+version)
            return
        except pkg_resources.VersionConflict:
            e = sys.exc_info()[1]
            if was_imported:
                sys.stderr.write(
                "The required version of distribute (>=%s) is not available,\n"
                "and can't be installed while this script is running. Please\n"
                "install a more recent version first, using\n"
                "'easy_install -U distribute'."
                "\n\n(Currently using %r)\n" % (version, e.args[0]))
                sys.exit(2)
            else:
                del pkg_resources, sys.modules['pkg_resources']  # reload ok
                return _do_download(version, download_base, to_dir,
                                    download_delay)
        except pkg_resources.DistributionNotFound:
            return _do_download(version, download_base, to_dir,
                                download_delay)
    finally:
        if not no_fake:
            _create_fake_setuptools_pkg_info(to_dir)


def download_setuptools(version=DEFAULT_VERSION, download_base=DEFAULT_URL,
                        to_dir=os.curdir, delay=15):
    """Download distribute from a specified location and return its filename

    `version` should be a valid distribute version number that is available
    as an egg for download under the `download_base` URL (which should end
    with a '/'). `to_dir` is the directory where the egg will be downloaded.
    `delay` is the number of seconds to pause before an actual download
    attempt.
    """
    # making sure we use the absolute path
    to_dir = os.path.abspath(to_dir)
    try:
        from urllib.request import urlopen
    except ImportError:
        from urllib2 import urlopen
    tgz_name = "distribute-%s.tar.gz" % version
    url = download_base + tgz_name
    saveto = os.path.join(to_dir, tgz_name)
    src = dst = None
    if not os.path.exists(saveto):  # Avoid repeated downloads
        try:
            log.warn("Downloading %s", url)
            src = urlopen(url)
            # Read/write all in one block, so we don't create a corrupt file
            # if the download is interrupted.
            data = src.read()
            dst = open(saveto, "wb")
            dst.write(data)
        finally:
            if src:
                src.close()
            if dst:
                dst.close()
    return os.path.realpath(saveto)


def _no_sandbox(function):
    def __no_sandbox(*args, **kw):
        try:
            from setuptools.sandbox import DirectorySandbox
            if not hasattr(DirectorySandbox, '_old'):
                def violation(*args):
                    pass
                DirectorySandbox._old = DirectorySandbox._violation
                DirectorySandbox._violation = violation
                patched = True
            else:
                patched = False
        except ImportError:
            patched = False

        try:
            return function(*args, **kw)
        finally:
            if patched:
                DirectorySandbox._violation = DirectorySandbox._old
                del DirectorySandbox._old

    return __no_sandbox


def _patch_file(path, content):
    """Will backup the file then patch it"""
    existing_content = open(path).read()
    if existing_content == content:
        # already patched
        log.warn('Already patched.')
        return False
    log.warn('Patching...')
    _rename_path(path)
    f = open(path, 'w')
    try:
        f.write(content)
    finally:
        f.close()
    return True

_patch_file = _no_sandbox(_patch_file)


def _same_content(path, content):
    return open(path).read() == content


def _rename_path(path):
    new_name = path + '.OLD.%s' % time.time()
    log.warn('Renaming %s into %s', path, new_name)
    os.rename(path, new_name)
    return new_name


def _remove_flat_installation(placeholder):
    if not os.path.isdir(placeholder):
        log.warn('Unkown installation at %s', placeholder)
        return False
    found = False
    for file in os.listdir(placeholder):
        if fnmatch.fnmatch(file, 'setuptools*.egg-info'):
            found = True
            break
    if not found:
        log.warn('Could not locate setuptools*.egg-info')
        return

    log.warn('Removing elements out of the way...')
    pkg_info = os.path.join(placeholder, file)
    if os.path.isdir(pkg_info):
        patched = _patch_egg_dir(pkg_info)
    else:
        patched = _patch_file(pkg_info, SETUPTOOLS_PKG_INFO)

    if not patched:
        log.warn('%s already patched.', pkg_info)
        return False
    # now let's move the files out of the way
    for element in ('setuptools', 'pkg_resources.py', 'site.py'):
        element = os.path.join(placeholder, element)
        if os.path.exists(element):
            _rename_path(element)
        else:
            log.warn('Could not find the %s element of the '
                     'Setuptools distribution', element)
    return True

_remove_flat_installation = _no_sandbox(_remove_flat_installation)


def _after_install(dist):
    log.warn('After install bootstrap.')
    placeholder = dist.get_command_obj('install').install_purelib
    _create_fake_setuptools_pkg_info(placeholder)


def _create_fake_setuptools_pkg_info(placeholder):
    if not placeholder or not os.path.exists(placeholder):
        log.warn('Could not find the install location')
        return
    pyver = '%s.%s' % (sys.version_info[0], sys.version_info[1])
    setuptools_file = 'setuptools-%s-py%s.egg-info' % \
        (SETUPTOOLS_FAKED_VERSION, pyver)
    pkg_info = os.path.join(placeholder, setuptools_file)
    if os.path.exists(pkg_info):
        log.warn('%s already exists', pkg_info)
        return

    log.warn('Creating %s', pkg_info)
    f = open(pkg_info, 'w')
    try:
        f.write(SETUPTOOLS_PKG_INFO)
    finally:
        f.close()

    pth_file = os.path.join(placeholder, 'setuptools.pth')
    log.warn('Creating %s', pth_file)
    f = open(pth_file, 'w')
    try:
        f.write(os.path.join(os.curdir, setuptools_file))
    finally:
        f.close()

_create_fake_setuptools_pkg_info = _no_sandbox(_create_fake_setuptools_pkg_info)


def _patch_egg_dir(path):
    # let's check if it's already patched
    pkg_info = os.path.join(path, 'EGG-INFO', 'PKG-INFO')
    if os.path.exists(pkg_info):
        if _same_content(pkg_info, SETUPTOOLS_PKG_INFO):
            log.warn('%s already patched.', pkg_info)
            return False
    _rename_path(path)
    os.mkdir(path)
    os.mkdir(os.path.join(path, 'EGG-INFO'))
    pkg_info = os.path.join(path, 'EGG-INFO', 'PKG-INFO')
    f = open(pkg_info, 'w')
    try:
        f.write(SETUPTOOLS_PKG_INFO)
    finally:
        f.close()
    return True

_patch_egg_dir = _no_sandbox(_patch_egg_dir)


def _before_install():
    log.warn('Before install bootstrap.')
    _fake_setuptools()


def _under_prefix(location):
    if 'install' not in sys.argv:
        return True
    args = sys.argv[sys.argv.index('install')+1:]
    for index, arg in enumerate(args):
        for option in ('--root', '--prefix'):
            if arg.startswith('%s=' % option):
                top_dir = arg.split('root=')[-1]
                return location.startswith(top_dir)
            elif arg == option:
                if len(args) > index:
                    top_dir = args[index+1]
                    return location.startswith(top_dir)
        if arg == '--user' and USER_SITE is not None:
            return location.startswith(USER_SITE)
    return True


def _fake_setuptools():
    log.warn('Scanning installed packages')
    try:
        import pkg_resources
    except ImportError:
        # we're cool
        log.warn('Setuptools or Distribute does not seem to be installed.')
        return
    ws = pkg_resources.working_set
    try:
        setuptools_dist = ws.find(pkg_resources.Requirement.parse('setuptools',
                                  replacement=False))
    except TypeError:
        # old distribute API
        setuptools_dist = ws.find(pkg_resources.Requirement.parse('setuptools'))

    if setuptools_dist is None:
        log.warn('No setuptools distribution found')
        return
    # detecting if it was already faked
    setuptools_location = setuptools_dist.location
    log.warn('Setuptools installation detected at %s', setuptools_location)

    # if --root or --preix was provided, and if
    # setuptools is not located in them, we don't patch it
    if not _under_prefix(setuptools_location):
        log.warn('Not patching, --root or --prefix is installing Distribute'
                 ' in another location')
        return

    # let's see if its an egg
    if not setuptools_location.endswith('.egg'):
        log.warn('Non-egg installation')
        res = _remove_flat_installation(setuptools_location)
        if not res:
            return
    else:
        log.warn('Egg installation')
        pkg_info = os.path.join(setuptools_location, 'EGG-INFO', 'PKG-INFO')
        if (os.path.exists(pkg_info) and
            _same_content(pkg_info, SETUPTOOLS_PKG_INFO)):
            log.warn('Already patched.')
            return
        log.warn('Patching...')
        # let's create a fake egg replacing setuptools one
        res = _patch_egg_dir(setuptools_location)
        if not res:
            return
    log.warn('Patched done.')
    _relaunch()


def _relaunch():
    log.warn('Relaunching...')
    # we have to relaunch the process
    # pip marker to avoid a relaunch bug
    if sys.argv[:3] == ['-c', 'install', '--single-version-externally-managed']:
        sys.argv[0] = 'setup.py'
    args = [sys.executable] + sys.argv
    sys.exit(subprocess.call(args))


def _extractall(self, path=".", members=None):
    """Extract all members from the archive to the current working
       directory and set owner, modification time and permissions on
       directories afterwards. `path' specifies a different directory
       to extract to. `members' is optional and must be a subset of the
       list returned by getmembers().
    """
    import copy
    import operator
    from tarfile import ExtractError
    directories = []

    if members is None:
        members = self

    for tarinfo in members:
        if tarinfo.isdir():
            # Extract directories with a safe mode.
            directories.append(tarinfo)
            tarinfo = copy.copy(tarinfo)
            tarinfo.mode = 448  # decimal for oct 0700
        self.extract(tarinfo, path)

    # Reverse sort directories.
    if sys.version_info < (2, 4):
        def sorter(dir1, dir2):
            return cmp(dir1.name, dir2.name)
        directories.sort(sorter)
        directories.reverse()
    else:
        directories.sort(key=operator.attrgetter('name'), reverse=True)

    # Set correct owner, mtime and filemode on directories.
    for tarinfo in directories:
        dirpath = os.path.join(path, tarinfo.name)
        try:
            self.chown(tarinfo, dirpath)
            self.utime(tarinfo, dirpath)
            self.chmod(tarinfo, dirpath)
        except ExtractError:
            e = sys.exc_info()[1]
            if self.errorlevel > 1:
                raise
            else:
                self._dbg(1, "tarfile: %s" % e)


def main(argv, version=DEFAULT_VERSION):
    """Install or upgrade setuptools and EasyInstall"""
    tarball = download_setuptools()
    _install(tarball)


if __name__ == '__main__':
    main(sys.argv[1:])
gcp (103 lines changed)
@@ -28,26 +28,26 @@ logging.basicConfig(level=logging.INFO,
 ###

 import gettext
-gettext.install('gcp', "i18n", unicode=True)
+gettext.install('gcp', "i18n")

 import sys
 import os,os.path
-from optparse import OptionParser, OptionGroup #To be replaced by argparse ASAP
-import cPickle as pickle
+from argparse import ArgumentParser
+import pickle
 try:
     import gobject
     #DBus
     import dbus, dbus.glib
     import dbus.service
     import dbus.mainloop.glib
-except ImportError,e:
+except ImportError as e:
     error(_("Error during import"))
     error(_("Please check dependecies:"),e)
     exit(1)
 try:
     from progressbar import ProgressBar, Percentage, Bar, ETA, FileTransferSpeed
     pbar_available=True
-except ImportError, e:
+except ImportError as e:
     info (_('ProgressBar not available, please download it at http://pypi.python.org/pypi/progressbar'))
     info (_('Progress bar deactivated\n--\n'))
     pbar_available=False
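The recurring `except ImportError,e:` to `except ImportError as e:` edits in this commit are the Python 3 spelling of exception binding; a minimal standalone illustration:

```python
# Python 2 accepted:  except ValueError, e:
# Python 3 requires:  except ValueError as e:
# (the "as" form also works from Python 2.6 onward, which eases migration)
try:
    int("not a number")
except ValueError as e:
    caught = type(e).__name__
    detail = str(e)
```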
@@ -103,11 +103,13 @@ class DbusObject(dbus.service.Object):
         @return: success (boolean) and error message if any (string)"""
         try:
             args = pickle.loads(str(args))
-        except TypeError, pickle.UnpicklingError:
+        except TypeError as e:
+            pickle.UnpicklingError = e
             return (False, _("INTERNAL ERROR: invalid arguments"))
         try:
             source_dir = pickle.loads(str(source_dir))
-        except TypeError, pickle.UnpicklingError:
+        except TypeError as e:
+            pickle.UnpicklingError = e
             return (False, _("INTERNAL ERROR: invalid source_dir"))
         return self._gcp.parseArguments(args, source_dir)
@@ -194,7 +196,7 @@ class GCP():
                 dbus_interface=const_DBUS_INTERFACE)
             self._main_instance = False

-        except dbus.exceptions.DBusException,e:
+        except dbus.exceptions.DBusException as e:
             if e._dbus_error_name=='org.freedesktop.DBus.Error.ServiceUnknown':
                 self.launchDbusMainInstance()
                 debug (_("gcp launched"))
@@ -230,7 +232,7 @@ class GCP():
         #(check freedesktop mounting signals)
         ret = {}
         try:
-            with open("/proc/mounts",'rb') as mounts:
+            with open("/proc/mounts",'r') as mounts:
                 for line in mounts.readlines():
                     fs_spec, fs_file, fs_vfstype, fs_mntops, fs_freq, fs_passno = line.split(' ')
                     ret[fs_file] = fs_vfstype
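The hunk above opens /proc/mounts in text mode and unpacks the six space-separated fields of each line. A standalone sketch of that parse on a sample entry (the line is illustrative, not read from a live system):

```python
# one illustrative /proc/mounts entry: spec, mountpoint, fstype, options, freq, passno
line = "/dev/sda1 /boot ext4 rw,relatime 0 0\n"

fs_spec, fs_file, fs_vfstype, fs_mntops, fs_freq, fs_passno = line.split(' ')
fs_by_mountpoint = {fs_file: fs_vfstype}  # what gcp's ret dict accumulates
```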
@@ -242,14 +244,12 @@ class GCP():
         """Add a file to the copy list
         @param path: absolute path of file
         @param options: options as return by optparse"""
-        debug (_("Adding to copy list: %(path)s ==> %(dest_path)s (%(fs_type)s)") % {"path":path.decode('utf-8','replace'),
-                                                                                     "dest_path":dest_path.decode('utf-8','replace'),
-                                                                                     "fs_type":self.getFsType(dest_path)} )
+        debug (_("Adding to copy list: %(path)s ==> %(dest_path)s (%(fs_type)s)") % {"path":path, "dest_path":dest_path, "fs_type":self.getFsType(dest_path)} )
         try:
             self.bytes_total+=os.path.getsize(path)
             self.copy_list.insert(0,(path, dest_path, options))
-        except OSError,e:
-            error(_("Can't copy %(path)s: %(exception)s") % {'path':path.decode('utf-8','replace'), 'exception':e.strerror})
+        except OSError as e:
+            error(_("Can't copy %(path)s: %(exception)s") % {'path':path, 'exception':e.strerror})


     def __appendDirToList(self, dirpath, dest_path, options):
@@ -267,21 +267,19 @@ class GCP():
             for filename in os.listdir(dirpath):
                 filepath = os.path.join(dirpath,filename)
                 if os.path.islink(filepath) and not options.dereference:
-                    debug ("Skippink symbolic dir: %s" % filepath.decode('utf-8','replace'))
+                    debug ("Skippink symbolic dir: %s" % filepath)
                     continue
                 if os.path.isdir(filepath):
                     full_dest_path = os.path.join(dest_path,filename)
                     self.__appendDirToList(filepath, full_dest_path, options)
                 else:
                     self.__appendToList(filepath, dest_path, options)
-        except OSError,e:
+        except OSError as e:
             try:
-                error(_("Can't append %(path)s to copy list: %(exception)s") % {'path':filepath.decode('utf-8','replace'),
-                                                                                'exception':e.strerror})
+                error(_("Can't append %(path)s to copy list: %(exception)s") % {'path':filepath, 'exception':e.strerror})
             except NameError:
                 #We can't list the dir
-                error(_("Can't access %(dirpath)s: %(exception)s") % {'dirpath':dirpath.decode('utf-8','replace'),
-                                                                      'exception':e.strerror})
+                error(_("Can't access %(dirpath)s: %(exception)s") % {'dirpath':dirpath, 'exception':e.strerror})

     def __checkArgs(self, options, source_dir, args):
         """Check thats args are files, and add them to copy list
@@ -292,17 +290,17 @@ class GCP():
         len_args = len(args)
         try:
             dest_path = os.path.normpath(os.path.join(source_dir, args.pop()))
-        except OSError,e:
+        except OSError as e:
             error (_("Invalid dest_path: %s"),e)

         for path in args:
             abspath = os.path.normpath(os.path.join(os.path.expanduser(source_dir), path))
             if not os.path.exists(abspath):
-                warning(_("The path given in arg doesn't exist or is not accessible: %s") % abspath.decode('utf-8','replace'))
+                warning(_("The path given in arg doesn't exist or is not accessible: %s") % abspath)
             else:
                 if os.path.isdir(abspath):
                     if not options.recursive:
-                        warning (_('omitting directory "%s"') % abspath.decode('utf-8','replace'))
+                        warning (_('omitting directory "%s"') % abspath)
                     else:
                         _basename=os.path.basename(os.path.normpath(path))
                         full_dest_path = dest_path if options.directdir else os.path.normpath(os.path.join(dest_path, _basename))
@@ -327,7 +325,7 @@ class GCP():
             assert(filename)
             dest_file = self.__filename_fix(options.dest_file,options) if options.dest_file else self.__filename_fix(os.path.join(dest_path,filename),options)
             if os.path.exists(dest_file) and not options.force:
-                warning (_("File [%s] already exists, skipping it !") % dest_file.decode('utf-8','replace'))
+                warning (_("File [%s] already exists, skipping it !") % dest_file)
                 self.journal.copyFailed()
                 self.journal.error("already exists")
                 self.journal.closeFile()
@@ -345,8 +343,7 @@ class GCP():
             gobject.io_add_watch(source_fd,gobject.IO_IN,self._copyFile,
                                  (dest_fd, options), priority=gobject.PRIORITY_DEFAULT)
             if not self.progress:
-                info(_("COPYING %(source)s ==> %(dest)s") % {"source":source_file.decode('utf-8','replace'),
-                                                             "dest":dest_file.decode('utf-8','replace')})
+                info(_("COPYING %(source)s ==> %(dest)s") % {"source":source_file, "dest":dest_file})
             return True
         else:
             #Nothing left to copy, we quit
@@ -439,7 +436,7 @@ class GCP():
                 os.chown(dest_file, st_file.st_uid, st_file.st_gid)
             elif preserve == 'timestamps':
                 os.utime(dest_file, (st_file.st_atime, st_file.st_mtime))
-        except OSError,e:
+        except OSError as e:
             self.journal.error("preserve-"+preserve)

     def __get_string_size(self, size):
@@ -543,69 +540,69 @@ class GCP():
         @return: a tuple (boolean, message) where the boolean is the success of the arguments
         validation, and message is the error message to print when necessary"""
         _usage="""
-        %prog [options] FILE DEST
-        %prog [options] FILE1 [FILE2 ...] DEST-DIR
+        %(prog)s [options] FILE DEST
+        %(prog)s [options] FILE1 [FILE2 ...] DEST-DIR

-        %prog --help for options list
+        %(prog)s --help for options list
         """
         for idx in range(len(full_args)):
-            if isinstance(full_args[idx], unicode):
-                #We don't want unicode as some filenames can be invalid unicode
-                full_args[idx] = full_args[idx].encode('utf-8')
+            full_args[idx] = full_args[idx].encode('utf-8')

-        parser = OptionParser(usage=_usage,version=ABOUT)
+        parser = ArgumentParser(usage=_usage)

-        parser.add_option("-r", "--recursive", action="store_true", default=False,
+        parser.add_argument("-r", "--recursive", action="store_true", default=False,
                     help=_("copy directories recursively"))

-        parser.add_option("-f", "--force", action="store_true", default=False,
+        parser.add_argument("-f", "--force", action="store_true", default=False,
                     help=_("force overwriting of existing files"))

-        parser.add_option("--preserve", action="store", default='mode,ownership,timestamps',
+        parser.add_argument("--preserve", action="store", default='mode,ownership,timestamps',
                     help=_("preserve the specified attributes"))

-        parser.add_option("-L", "--dereference", action="store_true", default=False,
+        parser.add_argument("-L", "--dereference", action="store_true", default=False,
                     help=_("always follow symbolic links in sources"))

-        parser.add_option("-P", "--no-dereference", action="store_false", dest='dereference',
+        parser.add_argument("-P", "--no-dereference", action="store_false", dest='dereference',
                     help=_("never follow symbolic links in sources"))

-        #parser.add_option("--no-unicode-fix", action="store_false", dest='unicode_fix', default=True,
+        #parser.add_argument("--no-unicode-fix", action="store_false", dest='unicode_fix', default=True,
         #            help=_("don't fix name encoding errors")) #TODO

-        parser.add_option("--no-fs-fix", action="store_false", dest='fs_fix', default=True,
+        parser.add_argument("--no-fs-fix", action="store_false", dest='fs_fix', default=True,
                     help=_("don't fix filesystem name incompatibily"))

-        parser.add_option("--no-progress", action="store_false", dest="progress", default=True,
+        parser.add_argument("--no-progress", action="store_false", dest="progress", default=True,
                     help=_("deactivate progress bar"))

-        parser.add_option("-v", "--verbose", action="store_true", default=False,
+        parser.add_argument("-v", "--verbose", action="store_true", default=False,
                     help=_("Show what is currently done"))

-        group_saving = OptionGroup(parser, "sources saving")
+        parser.add_argument("-V", "--version", action="version", version=ABOUT)

-        group_saving.add_option("--sources-save", action="store",
+        group_saving = parser.add_argument_group("sources saving")
+
+        group_saving.add_argument("--sources-save", action="store",
                     help=_("Save source arguments"))

-        group_saving.add_option("--sources-replace", action="store",
+        group_saving.add_argument("--sources-replace", action="store",
                     help=_("Save source arguments and replace memory if it already exists"))

-        group_saving.add_option("--sources-load", action="store",
+        group_saving.add_argument("--sources-load", action="store",
                     help=_("Load source arguments"))

-        group_saving.add_option("--sources-del", action="store",
+        group_saving.add_argument("--sources-del", action="store",
                     help=_("delete saved sources"))

-        group_saving.add_option("--sources-list", action="store_true", default=False,
+        group_saving.add_argument("--sources-list", action="store_true", default=False,
                     help=_("List names of saved sources"))

-        group_saving.add_option("--sources-full-list", action="store_true", default=False,
+        group_saving.add_argument("--sources-full-list", action="store_true", default=False,
                     help=_("List names of saved sources and files in it"))

-        parser.add_option_group(group_saving)
+        parser.add_argument_group(group_saving)

-        (options, args) = parser.parse_args(full_args)
+        (options, args) = parser.parse_known_args()
         options.directdir = False #True only in the special case: we are copying a dir and it doesn't exists
         #options check
         if options.progress and not pbar_available:
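The switch above from `parse_args(full_args)` to `parse_known_args()` changes error behaviour: argparse hands back unrecognized arguments instead of aborting. A minimal sketch reproducing only the `-r` and `-f` flags from the hunk:

```python
from argparse import ArgumentParser

parser = ArgumentParser(description="sketch of the gcp option port")
parser.add_argument("-r", "--recursive", action="store_true", default=False,
                    help="copy directories recursively")
parser.add_argument("-f", "--force", action="store_true", default=False,
                    help="force overwriting of existing files")

# parse_known_args returns (namespace, leftovers) rather than erroring
# out on arguments it does not recognize
options, leftover = parser.parse_known_args(["-r", "src_file", "dest_dir"])
```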
@@ -653,7 +650,7 @@ class GCP():
         if len(args) < 2:
             _error_msg = _("Wrong number of arguments")
             return (False, _error_msg)
-        debug(_("adding args to gcp: %s") % str(args).decode('utf-8','replace'))
+        debug(_("adding args to gcp: %s") % args)
         self.__checkArgs(options, source_dir, args)
         if not self.__launched:
             self.journal = Journal()
setup.py (7 lines changed)
@@ -1,9 +1,6 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-

-from distribute_setup import use_setuptools
-use_setuptools()
-
 from setuptools import setup
 import sys
 from os import path
@@ -24,8 +21,8 @@ setup(name=name,
                 'Programming Language :: Python',
                 'Topic :: Utilities'
                 ],
-      data_files=[(path.join(sys.prefix,'share/locale/fr/LC_MESSAGES'), ['i18n/fr/LC_MESSAGES/gcp.mo']),
+      data_files=[('share/locale/fr/LC_MESSAGES', ['i18n/fr/LC_MESSAGES/gcp.mo']),
                   ('share/man/man1', ["gcp.1"]),
-                  ('share/doc/%s' % name, ['COPYING','README'])],
+                  ('share/doc/%s' % name, ['COPYING','README.rst'])],
       scripts=['gcp'],
       )
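The data_files change works because setuptools treats relative data_files paths as relative to the installation prefix, so the explicit sys.prefix join was redundant; a small sketch of the equivalence:

```python
import os
import sys

# before: the path was joined against sys.prefix by hand
explicit = os.path.join(sys.prefix, "share/locale/fr/LC_MESSAGES")

# after: setup.py passes the relative path and the installer performs
# the same prefix join itself at install time
relative = "share/locale/fr/LC_MESSAGES"
resolved = os.path.join(sys.prefix, relative)
```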