[csw-devel] SF.net SVN: gar:[12515] csw/mgar/gar/v2-fortran

gadavis at users.sourceforge.net
Wed Jan 12 20:26:12 CET 2011


Revision: 12515
          http://gar.svn.sourceforge.net/gar/?rev=12515&view=rev
Author:   gadavis
Date:     2011-01-12 19:26:11 +0000 (Wed, 12 Jan 2011)

Log Message:
-----------
Merge in changes from v2 trunk r11821:HEAD to v2-fortran

Modified Paths:
--------------
    csw/mgar/gar/v2-fortran/bin/checkpkg
    csw/mgar/gar/v2-fortran/bin/checkpkg_inspect_stats.py
    csw/mgar/gar/v2-fortran/bin/cpan_apply_updates
    csw/mgar/gar/v2-fortran/bin/custom-pkgtrans
    csw/mgar/gar/v2-fortran/bin/depgraph
    csw/mgar/gar/v2-fortran/bin/gem2pkg
    csw/mgar/gar/v2-fortran/bin/mkpackage
    csw/mgar/gar/v2-fortran/bin/update-commondirs
    csw/mgar/gar/v2-fortran/categories/cpan/category.mk
    csw/mgar/gar/v2-fortran/categories/java/category.mk
    csw/mgar/gar/v2-fortran/categories/kde/category.mk
    csw/mgar/gar/v2-fortran/categories/kde4/category.mk
    csw/mgar/gar/v2-fortran/categories/rbgems/category.mk
    csw/mgar/gar/v2-fortran/categories/xfce/category.mk
    csw/mgar/gar/v2-fortran/etc/commondirs-i386
    csw/mgar/gar/v2-fortran/etc/commondirs-sparc
    csw/mgar/gar/v2-fortran/gar.conf.mk
    csw/mgar/gar/v2-fortran/gar.lib.mk
    csw/mgar/gar/v2-fortran/gar.mk
    csw/mgar/gar/v2-fortran/gar.pkg.mk
    csw/mgar/gar/v2-fortran/gar.svn.mk
    csw/mgar/gar/v2-fortran/lib/python/README
    csw/mgar/gar/v2-fortran/lib/python/catalog.py
    csw/mgar/gar/v2-fortran/lib/python/checkpkg.py
    csw/mgar/gar/v2-fortran/lib/python/checkpkg_test.py
    csw/mgar/gar/v2-fortran/lib/python/compare_pkgs.py
    csw/mgar/gar/v2-fortran/lib/python/configuration.py
    csw/mgar/gar/v2-fortran/lib/python/database.py
    csw/mgar/gar/v2-fortran/lib/python/dependency_checks.py
    csw/mgar/gar/v2-fortran/lib/python/dependency_checks_test.py
    csw/mgar/gar/v2-fortran/lib/python/gartest.py
    csw/mgar/gar/v2-fortran/lib/python/inspective_package.py
    csw/mgar/gar/v2-fortran/lib/python/models.py
    csw/mgar/gar/v2-fortran/lib/python/opencsw.py
    csw/mgar/gar/v2-fortran/lib/python/opencsw_test.py
    csw/mgar/gar/v2-fortran/lib/python/overrides.py
    csw/mgar/gar/v2-fortran/lib/python/package.py
    csw/mgar/gar/v2-fortran/lib/python/package_checks.py
    csw/mgar/gar/v2-fortran/lib/python/package_checks_test.py
    csw/mgar/gar/v2-fortran/lib/python/package_stats.py
    csw/mgar/gar/v2-fortran/lib/python/package_stats_test.py
    csw/mgar/gar/v2-fortran/lib/python/package_test.py
    csw/mgar/gar/v2-fortran/lib/python/pkgdb.py
    csw/mgar/gar/v2-fortran/lib/python/sharedlib_utils.py
    csw/mgar/gar/v2-fortran/lib/python/sharedlib_utils_test.py
    csw/mgar/gar/v2-fortran/lib/python/tag.py
    csw/mgar/gar/v2-fortran/lib/python/tag_test.py
    csw/mgar/gar/v2-fortran/lib/python/testdata/neon_stats.py
    csw/mgar/gar/v2-fortran/lib/python/testdata/tree_stats.py
    csw/mgar/gar/v2-fortran/lib/sh/libcheckpkg.sh
    csw/mgar/gar/v2-fortran/pkglib/Makefile
    csw/mgar/gar/v2-fortran/tests/example_test.py
    csw/mgar/gar/v2-fortran/tests/overrides_test.py
    csw/mgar/gar/v2-fortran/tests/run_tests.py
    csw/mgar/gar/v2-fortran/tests/static/example/Makefile

Added Paths:
-----------
    csw/mgar/gar/v2-fortran/bin/comparepkg
    csw/mgar/gar/v2-fortran/lib/python/checkpkg2.py
    csw/mgar/gar/v2-fortran/lib/python/checkpkg_defaults.ini
    csw/mgar/gar/v2-fortran/lib/python/checkpkg_lib.py
    csw/mgar/gar/v2-fortran/lib/python/checkpkg_lib_test.py
    csw/mgar/gar/v2-fortran/lib/python/common_constants.py
    csw/mgar/gar/v2-fortran/lib/python/database_test.py
    csw/mgar/gar/v2-fortran/lib/python/inspective_package_test.py
    csw/mgar/gar/v2-fortran/lib/python/ldd_emul.py
    csw/mgar/gar/v2-fortran/lib/python/ldd_emul_test.py
    csw/mgar/gar/v2-fortran/lib/python/models_test.py
    csw/mgar/gar/v2-fortran/lib/python/mute_progressbar.py
    csw/mgar/gar/v2-fortran/lib/python/pkgdb_test.py
    csw/mgar/gar/v2-fortran/lib/python/pkgmap.py
    csw/mgar/gar/v2-fortran/lib/python/pkgmap_test.py
    csw/mgar/gar/v2-fortran/lib/python/pylintrc
    csw/mgar/gar/v2-fortran/lib/python/shell.py
    csw/mgar/gar/v2-fortran/lib/python/struct_util.py
    csw/mgar/gar/v2-fortran/lib/python/struct_util_test.py
    csw/mgar/gar/v2-fortran/lib/python/system_pkgmap.py
    csw/mgar/gar/v2-fortran/lib/python/system_pkgmap_test.py
    csw/mgar/gar/v2-fortran/lib/python/test_base.py
    csw/mgar/gar/v2-fortran/lib/sh/db_privileges.sh

Removed Paths:
-------------
    csw/mgar/gar/v2-fortran/bin/analyze_module_results.py
    csw/mgar/gar/v2-fortran/bin/checkpkg_collect_stats.py
    csw/mgar/gar/v2-fortran/bin/checkpkg_run_modules.py
    csw/mgar/gar/v2-fortran/lib/sh/run_full_cat.sh

Property Changed:
----------------
    csw/mgar/gar/v2-fortran/
    csw/mgar/gar/v2-fortran/bin/checkpkg
    csw/mgar/gar/v2-fortran/lib/python/package_stats_test.py
    csw/mgar/gar/v2-fortran/lib/python/package_test.py
    csw/mgar/gar/v2-fortran/pkglib/csw/depend


Property changes on: csw/mgar/gar/v2-fortran
___________________________________________________________________
Modified: svn:mergeinfo
   - /csw/mgar/gar/v2:4936-6678,10883-11818
/csw/mgar/gar/v2-bwalton:9784-10011
/csw/mgar/gar/v2-checkpkg:7722-7855
/csw/mgar/gar/v2-checkpkg-override-relocation:10585-10737
/csw/mgar/gar/v2-checkpkg-stats:8454-8649
/csw/mgar/gar/v2-collapsed-modulations:6895
/csw/mgar/gar/v2-dirpackage:8125-8180
/csw/mgar/gar/v2-git/v2-relocate:7617
/csw/mgar/gar/v2-migrateconf:7082-7211
/csw/mgar/gar/v2-noexternals:11592-11745
/csw/mgar/gar/v2-relocate:5028-11738
/csw/mgar/gar/v2-skayser:6087-6132
/csw/mgar/gar/v2-sqlite:10434-10449
   + /csw/mgar/gar/v2:4936-6678,10883-11818,11822-12514
/csw/mgar/gar/v2-bwalton:9784-10011
/csw/mgar/gar/v2-checkpkg:7722-7855
/csw/mgar/gar/v2-checkpkg-override-relocation:10585-10737
/csw/mgar/gar/v2-checkpkg-stats:8454-8649
/csw/mgar/gar/v2-collapsed-modulations:6895
/csw/mgar/gar/v2-dirpackage:8125-8180
/csw/mgar/gar/v2-git/v2-relocate:7617
/csw/mgar/gar/v2-migrateconf:7082-7211
/csw/mgar/gar/v2-noexternals:11592-11745
/csw/mgar/gar/v2-relocate:5028-11738
/csw/mgar/gar/v2-skayser:6087-6132
/csw/mgar/gar/v2-sqlite:10434-10449

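For reference, the svn:mergeinfo delta above records that trunk revisions 11822-12514 of /csw/mgar/gar/v2 are now part of this branch. The exact commands used are not recorded in this message; a merge of this kind is typically performed along these lines (working-copy path hypothetical):

    # in a clean checkout of the v2-fortran branch
    cd ~/src/csw/mgar/gar/v2-fortran
    svn merge -r11821:HEAD ^/csw/mgar/gar/v2 .
    # inspect what Subversion recorded about the merge
    svn propget svn:mergeinfo .
    svn commit -m "Merge in changes from v2 trunk r11821:HEAD to v2-fortran"
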
Deleted: csw/mgar/gar/v2-fortran/bin/analyze_module_results.py
===================================================================
--- csw/mgar/gar/v2-fortran/bin/analyze_module_results.py	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/analyze_module_results.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,73 +0,0 @@
-#!/opt/csw/bin/python2.6
-# $Id$
-
-import itertools
-import operator
-import optparse
-import os
-import pprint
-import progressbar
-import sys
-import textwrap
-
-# The following bit of code sets the correct path to Python libraries
-# distributed with GAR.
-path_list = [os.path.dirname(__file__),
-             "..", "lib", "python"]
-sys.path.append(os.path.join(*path_list))
-import checkpkg
-import overrides
-import package_stats
-
-BEFORE_OVERRIDES = """If any of the reported errors were false positives, you
-can override them by pasting the lines below into the GAR recipe."""
-
-AFTER_OVERRIDES = """Please note that checkpkg isn't suggesting you should
-simply add these overrides to the Makefile.  It only shows what the overrides
-could look like.  You need to understand what the reported issues are about and
-use your best judgement to decide whether to fix the underlying problems or
-override them. For more information, scroll up and read the detailed
-messages."""
-
-UNAPPLIED_OVERRIDES = """WARNING: Some overrides did not match any errors.
-They can be removed, as they don't take any effect anyway.  If you're getting
-errors at the same time, maybe you didn't specify the overrides correctly."""
-
-def main():
-  parser = optparse.OptionParser()
-  parser.add_option("-c", "--catalog_file", dest="catalog",
-                    help="Optional catalog file")
-  parser.add_option("-q", "--quiet", dest="quiet",
-                    default=False, action="store_true",
-                    help=("Display less messages"))
-  options, args = parser.parse_args()
-  filenames = args
-
-  # This might be bottleneck.  Perhaps a list of md5 sums can be given to this
-  # script instead.
-
-  # It might be a good idea to store the error tags in the database and
-  # eliminate the need to access the directory with the error tag files.
-
-  pkgstats = package_stats.StatsListFromCatalog(filenames, options.catalog)
-  overrides_list = [pkg.GetSavedOverrides() for pkg in pkgstats]
-  override_list = reduce(operator.add, overrides_list)
-  error_tags = reduce(operator.add, [stat.GetSavedErrorTags() for stat in pkgstats])
-  (tags_after_overrides,
-   unapplied_overrides) = overrides.ApplyOverrides(error_tags, override_list)
-  if not options.quiet:
-    if tags_after_overrides:
-      print textwrap.fill(BEFORE_OVERRIDES, 80)
-      for checkpkg_tag in tags_after_overrides:
-        print checkpkg_tag.ToGarSyntax()
-      print textwrap.fill(AFTER_OVERRIDES, 80)
-    if unapplied_overrides:
-      print textwrap.fill(UNAPPLIED_OVERRIDES, 80)
-      for override in unapplied_overrides:
-        print "* Unused %s" % override
-  exit_code = bool(tags_after_overrides)
-  sys.exit(exit_code)
-
-
-if __name__ == '__main__':
-  main()

Modified: csw/mgar/gar/v2-fortran/bin/checkpkg
===================================================================
--- csw/mgar/gar/v2-fortran/bin/checkpkg	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/checkpkg	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,322 +1 @@
-#!/bin/ksh -p
-# 
-# $Id$
-#
-# checkpkg 1.51
-#
-# diff to 1.46a
-#  - check multiple package files
-#  - checkpkg.d plugin support
-#  - getopts support for command line options
-#  - colors
-#  - modular architecture + unit tests
-#  - reliable shared library checking
-#
-# This script examines a package that has been put together
-# for submittal to the CSW archive at opencsw.org
-#
-# It examines it for compliance with the packaging standards at
-# http://www.opencsw.org/standards/
-# It DOES NOT CATCH EVERYTHING. However, the package will be
-# tested with this script before acceptance, so you may as well
-# save yourself some time, and run the script yourself!
-#
-# Be sure to occasionally do a "pkg-get update cswutils" so that
-# you know you are tracking the most current version.
-# 
-# TODO:
-# - add message handling to the CheckInterface class.
-#
-
-PATH=$PATH:/usr/sbin
-readonly NAME_MAX_LENGTH=${NAME_MAX_LENGTH:-20}
-
-command_basename=`basename $0`
-command_basedir="${0%/${command_basename}}"
-libshdir="${command_basedir}/../lib/sh"
-readonly command_basename command_basedir libshdir
-. "${libshdir}/libcheckpkg.sh"
-
-LOCAL_ARCH=`uname -p`
-CHECKPKG_TMPDIR=${CHECKPKG_TMPDIR:-/var/tmp}
-readonly CHECKPKG_TMPDIR
-
-# Colors only when running interactively
-if [[ -t 1 ]]; then
-	GREEN="\\033[0;32;40m"
-	RED="\\033[1;31;40m"
-	BOLD="\\033[1m"
-	COLOR_RESET="\\033[00m"
-else
-	GREEN=""
-	RED=""
-	BOLD=""
-	COLOR_RESET=""
-fi
-readonly GREEN RED BOLD COLOR_RESET
-
-readonly selfpath="$0"
-readonly selfargs="$@"
-
-cleanup(){
-	if [[ -d "$EXTRACTDIR" ]] ; then
-		rm -rf $EXTRACTDIR
-	fi
-	cleantmparchives
-}
-
-cleantmparchives() {
-	for TMPARCHIVE in $tmparchives; do
-		if [[ "$TMPARCHIVE" != "" ]]; then
-			[ -f "$TMPARCHIVE" ] && rm $TMPARCHIVE
-		fi
-	done
-}
-
-cleanupset(){
-    if [ "`echo $SETINF*`" != "$SETINF*" ]; then
-	rm $SETINF*
-    fi
-}
-
-# Print error message, and quit program.
-errmsg(){
-	print ERROR: $* >/dev/fd/2
-	cleanup
-	cleanupset
-	print "To run checkpkg in the debug mode, add the '-d' flag, for example:"
-  # selfargs can be very, very long. Find a way to truncate it.
-	# print "${selfpath} -d ${selfargs}"
-	print "After you modify any overrides, you need to do gmake remerge repackage"
-	print "or gmake platforms-remerge platforms-repackage."
-	exit 1
-}
-
-debugmsg() {
-	if [[ "${DEBUG}" != "" ]]; then
-		print "DEBUG: $*" > /dev/fd/2
-	fi
-}
-
-# TODO: Options to add:
-#  - Use an pre-cached (from a catalog file?) list of md5 sums
-#  - Don't use the data from /var/sadm/install/contents
-display_help=0
-SKIP_STATS_COLLECTION=0
-MD5_SUMS_CATALOG_FILE=""
-INSTALL_CONTENTS_FILES="/var/sadm/install/contents"
-ANALYZE=1
-PROFILE=0
-QUIET=0
-
-while getopts hsdNM:o:c:Apq opt; do
-	case "${opt}" in
-	  c)
-	    INSTALL_CONTENTS_FILES="${INSTALL_CONTENTS_FILES} ${OPTARG}"
-	    ;;
-    d)
-      DEBUG=1
-      ;;
-    h)
-      display_help=1
-      ;;
-    N)
-      SKIP_STATS_COLLECTION=1
-      ;;
-    M)
-      MD5_SUMS_CATALOG_FILE="${OPTARG}"
-      ;;
-    A)
-      ANALYZE=0
-      ;;
-    p)
-      PROFILE=1
-      ;;
-    q) QUIET=1
-      ;;
-    *)
-      echo "Unknown option '${opt}'"
-      ;;
-  esac
-done
-shift $(( $OPTIND -1 ))
-
-readonly INSTALL_CONTENTS_FILES
-readonly MD5_SUMS_CATALOG_FILE
-readonly SKIP_STATS_COLLECTION
-readonly ANALYZE
-readonly PROFILE
-readonly QUIET
-
-if [[ "${display_help}" -eq 1 ]] ; then
-  print 'Usage: checkpkg [options] pkg1 [pkg2 ....]'
-  print 'Options:'
-  print '   -c <file>  use an additional install/contents file'
-  print '   -d         display debug messages'
-  print '   -N         skip statistics collection'
-  print '   -M <file>  use package md5sums from a catalog file'
-  print '   -A         Do not analyze the results.'
-  print '   -p         Enable profiling'
-  print '   -q         Display less messages'
-  print ''
-  print 'Error tags are saved to the sqlite database.'
-  exit 0
-fi
-
-# a unique filename for the list of package deps and libs we see in a 'set'
-SETINF=$CHECKPKG_TMPDIR/checkpkg.$$.`date +%Y%m%d%H%M%S`
-SETLIBS=$SETINF.libs
-SETDEPS=$SETINF.deps
-pkgnames=""
-tmparchives=""
-
-EXTRACTDIR=$CHECKPKG_TMPDIR/dissect.$$
-
-if [ -d $EXTRACTDIR ] ; then
-	errmsg ERROR: $EXTRACTDIR already exists
-fi
-
-for f in "$@"
-do
-
-  if [[ ! -f $f ]] ; then
-    errmsg ERROR: $f does not exist
-  fi
-
-
-[ -d ${EXTRACTDIR} ] || mkdir ${EXTRACTDIR}
-
-########################################
-# Check for some common errors
-#########################################
-
-# TODO: To be ported.
-#
-# # find all executables and dynamic libs,and list their filenames.
-# if [[ "$basedir" != "" ]] ; then
-# 	print
-# 	if [[ -f $EXTRACTDIR/elflist ]] ; then
-# 		print "Checking relocation ability..."
-# 		xargs strings < $EXTRACTDIR/elflist| grep /opt/csw
-# 		if [[ $? -eq 0 ]] ; then
-# 			errmsg package build as relocatable, but binaries have hardcoded /opt/csw paths in them
-# 		else
-# 			print trivial check passed
-# 		fi
-# 	else
-# 		echo No relocation check done for non-binary relocatable package.
-# 	fi
-# fi
-
-tmparchives="$tmparchives $TMPARCHIVE"
-done
-
-# Plugin section.  This is here to support programming languages other than
-# Python.  As of 2010-03-16 there are no checks in there.  If it stays empty
-# and no checks in other languages get written, it could be removed.
-#
-# Plugins should live in the checkpkg.d subdirectory of the directory that
-# contains checkpkg.  Each plugin should be an executable whose file name
-# begins with "checkpkg-".
-
-test_suite_ok=1
-checkpkg_module_dir="${command_basedir}/../lib/checkpkg.d"
-checkpkg_module_tag="checkpkg-"
-checkpkg_stats_basedir="${HOME}/.checkpkg/stats"
-
-# Cleaning up old *.pyc files which can cause grief.  This is because of the
-# move of Python libraries.
-for pyc_file in ${checkpkg_module_dir}/opencsw.pyc \
-                ${checkpkg_module_dir}/checkpkg.pyc; do
-  if [ -f "${pyc_file}" ]; then
-    echo "Removing old pyc file: '${pyc_file}'"
-    rm "${pyc_file}"
-  fi
-done
-
-if [[ "${DEBUG}" != "" ]]; then
-	extra_options="--debug"
-fi
-if [[ "${PROFILE}" -eq 1 ]]; then
-	extra_options="${extra_options} --profile"
-fi
-if [[ "${QUIET}" -eq 1 ]]; then
-	quiet_options="--quiet"
-else
-	quiet_options=""
-fi
-
-if [[ -n "${MD5_SUMS_CATALOG_FILE}" ]]; then
-	catalog_options="--catalog=${MD5_SUMS_CATALOG_FILE}"
-else
-	catalog_options=""
-fi
-
-# /var/sadm/install/contents cache update
-# TODO: Either remove this section or stop the stats collection phase from
-# updating the cache.
-${command_basedir}/update_contents_cache.py ${extra_options}
-if [[ $? -ne 0 ]]; then
-	errmsg "Updating the contents cache has failed."
-fi
-if [[ "${SKIP_STATS_COLLECTION}" -eq 0 ]]; then
-  # Collects package stats to be analyzed later
-  ${command_basedir}/checkpkg_collect_stats.py \
-      ${catalog_options} \
-      ${extra_options} \
-      "$@"
-  if [[ "$?" -ne 0 ]]; then
-    errmsg "Stats collection phase has failed."
-  fi
-fi
-
-# TODO: A performance problem. The following line means that the md5sums are
-# calculated once more.
-if [ "${MD5_SUMS_CATALOG_FILE}" ]; then
-	debugmsg "Reading md5sums from ${MD5_SUMS_CATALOG_FILE}"
-	md5sums=`cat "${MD5_SUMS_CATALOG_FILE}" \
-	    | awk '{print $5}' \
-	    | ggrep -E '[0-9abcdef]{32}'`
-else
-  debugmsg "Calculating md5 sums of all the package files."
-  md5sums=`gmd5sum "$@" | awk '{print $1}'`
-fi
-debugmsg "All md5 sums: ${md5sums}"
-
-# Running the checks.
-${command_basedir}/checkpkg_run_modules.py \
-    ${extra_options} \
-    -b "${checkpkg_stats_basedir}" \
-    ${quiet_options} \
-    ${md5sums}
-if [[ "$?" -ne 0 ]]; then
-  print "There was a problem analyzing package stats."
-  test_suite_ok=0
-fi
-
-if [[ ${test_suite_ok} -ne 1 ]]; then
-	errmsg "One or more tests have finished with an error."
-fi
-
-if [[ "${ANALYZE}" -eq 1 ]]; then
-# Collecting errors and applying the overrides.
-# This has to use the original files.
-  ${command_basedir}/analyze_module_results.py \
-      ${catalog_options} \
-      ${quiet_options} \
-      "$@"
-  if [[ "$?" -ne 0 ]]; then
-    errmsg "${RED}Checkpkg has reported errors.${COLOR_RESET}"
-  else
-    print "${GREEN}Checkpkg reports no errors.${COLOR_RESET}"
-  fi
-else
-	echo "Skipping result analysis."
-fi
-
-print ""
-
-# Cleaning up after all packages
-cleanup
-
-cleanupset
+link ../lib/python/checkpkg2.py
\ No newline at end of file


Property changes on: csw/mgar/gar/v2-fortran/bin/checkpkg
___________________________________________________________________
Deleted: svn:executable
   - *
Added: svn:special
   + *

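The one-line file body "link ../lib/python/checkpkg2.py" together with the property change above (svn:executable dropped, svn:special added) is how Subversion stores a symbolic link: the repository keeps "link <target>" as the file contents and marks it with svn:special, and the client materializes it as a symlink on checkout. One way to create such a link under version control is sketched below (hypothetical file name, shown only to illustrate the representation):

    ln -s ../lib/python/checkpkg2.py mylink
    svn add mylink
    svn proplist -v mylink     # shows: svn:special = *
    # 'svn diff' for the new node shows its body as: link ../lib/python/checkpkg2.py
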
Deleted: csw/mgar/gar/v2-fortran/bin/checkpkg_collect_stats.py
===================================================================
--- csw/mgar/gar/v2-fortran/bin/checkpkg_collect_stats.py	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/checkpkg_collect_stats.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,69 +0,0 @@
-#!/opt/csw/bin/python2.6
-#
-# $Id$
-#
-# Collects statistics about a package and saves to a directory, for later use
-# by checkpkg modules.
-
-import itertools
-import logging
-import optparse
-import os
-import os.path
-import subprocess
-import sys
-import progressbar
-
-# The following bit of code sets the correct path to Python libraries
-# distributed with GAR.
-path_list = [os.path.dirname(__file__),
-             "..", "lib", "python"]
-sys.path.append(os.path.join(*path_list))
-import checkpkg
-import opencsw
-import package_stats
-
-def main():
-  parser = optparse.OptionParser()
-  parser.add_option("-d", "--debug", dest="debug",
-                    default=False, action="store_true",
-                    help="Turn on debugging messages")
-  parser.add_option("-c", "--catalog", dest="catalog",
-                    help="Catalog file")
-  parser.add_option("-p", "--profile", dest="profile",
-                    default=False, action="store_true",
-                    help="A disabled option")
-  options, args = parser.parse_args()
-  if options.debug:
-    logging.basicConfig(level=logging.DEBUG)
-  else:
-    logging.basicConfig(level=logging.INFO)
-  logging.debug("Collecting statistics about given package files.")
-  args_display = args
-  if len(args_display) > 5:
-    args_display = args_display[:5] + ["...more..."]
-  file_list = args
-  logging.debug("Processing: %s, please be patient", args_display)
-  stats_list = package_stats.StatsListFromCatalog(
-      file_list, options.catalog, options.debug)
-  # Reversing the item order in the list, so that the pop() method can be used
-  # to get packages, and the order of processing still matches the one in the
-  # catalog file.
-  stats_list.reverse()
-  total_packages = len(stats_list)
-  counter = itertools.count(1)
-  logging.info("Juicing the srv4 package stream files...")
-  bar = progressbar.ProgressBar()
-  bar.maxval = total_packages
-  bar.start()
-  while stats_list:
-    # This way objects will get garbage collected as soon as they are removed
-    # from the list by pop().  The destructor (__del__()) of the srv4 class
-    # removes the temporary directory from the disk.  This allows to process
-    # the whole catalog.
-    stats_list.pop().CollectStats()
-    bar.update(counter.next())
-  bar.finish()
-
-if __name__ == '__main__':
-  main()

Modified: csw/mgar/gar/v2-fortran/bin/checkpkg_inspect_stats.py
===================================================================
--- csw/mgar/gar/v2-fortran/bin/checkpkg_inspect_stats.py	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/checkpkg_inspect_stats.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -18,8 +18,10 @@
 sys.path.append(os.path.join(*path_list))
 import checkpkg
 import opencsw
+import configuration
 
 def main():
+  configuration.SetUpSqlobjectConnection()
   usage = "Usage: %prog [ options ] file | md5 [ file | md5 [ ... ] ]"
   parser = optparse.OptionParser(usage)
   parser.add_option("-d", "--debug", dest="debug",

Deleted: csw/mgar/gar/v2-fortran/bin/checkpkg_run_modules.py
===================================================================
--- csw/mgar/gar/v2-fortran/bin/checkpkg_run_modules.py	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/checkpkg_run_modules.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,56 +0,0 @@
-#!/opt/csw/bin/python2.6
-# $Id$
-
-"""This script runs all the checks written in Python."""
-
-import datetime
-import logging
-import os
-import os.path
-import sys
-import re
-import cProfile
-
-CHECKPKG_MODULE_NAME = "Second checkpkg API version"
-
-# The following bit of code sets the correct path to Python libraries
-# distributed with GAR.
-path_list = [os.path.dirname(__file__),
-             "..", "lib", "python"]
-sys.path.append(os.path.join(*path_list))
-import checkpkg
-import opencsw
-
-
-def main():
-  options, args = checkpkg.GetOptions()
-  if options.debug:
-    logging.basicConfig(level=logging.DEBUG)
-  else:
-    logging.basicConfig(level=logging.INFO)
-  md5sums = args
-  # CheckpkgManager2 class abstracts away things such as the collection of
-  # results.
-  check_manager = checkpkg.CheckpkgManager2(CHECKPKG_MODULE_NAME,
-                                            options.stats_basedir,
-                                            md5sums,
-                                            options.debug)
-  # Running the checks, reporting and exiting.
-  exit_code, screen_report, tags_report = check_manager.Run()
-  screen_report = unicode(screen_report)
-  if not options.quiet and screen_report:
-    sys.stdout.write(screen_report)
-  else:
-    logging.debug("No screen report.")
-  sys.exit(exit_code)
-
-
-if __name__ == '__main__':
-  if "--profile" in sys.argv:
-    t_str = datetime.datetime.now().strftime("%Y-%m-%d-%H-%M")
-    home = os.environ["HOME"]
-    cprof_file_name = os.path.join(
-        home, ".checkpkg", "run-modules-%s.cprof" % t_str)
-    cProfile.run("main()", sort=1, filename=cprof_file_name)
-  else:
-    main()

Copied: csw/mgar/gar/v2-fortran/bin/comparepkg (from rev 12514, csw/mgar/gar/v2/bin/comparepkg)
===================================================================
--- csw/mgar/gar/v2-fortran/bin/comparepkg	                        (rev 0)
+++ csw/mgar/gar/v2-fortran/bin/comparepkg	2011-01-12 19:26:11 UTC (rev 12515)
@@ -0,0 +1 @@
+link ../lib/python/compare_pkgs.py
\ No newline at end of file

Modified: csw/mgar/gar/v2-fortran/bin/cpan_apply_updates
===================================================================
--- csw/mgar/gar/v2-fortran/bin/cpan_apply_updates	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/cpan_apply_updates	2011-01-12 19:26:11 UTC (rev 12515)
@@ -28,7 +28,7 @@
         next;
     }
     print "Updating $module to $newvers\n";
-    my $rpat = "s/^(GARVERSION).*\$/\$1 = $newvers/";
+    my $rpat = "s/^(VERSION).*\$/\$1 = $newvers/";
     system("perl -i.bak -plne '$rpat' $module/Makefile")
         and die "Failed to upgrade $module\n";
     system("gmake -C $module update")

Modified: csw/mgar/gar/v2-fortran/bin/custom-pkgtrans
===================================================================
--- csw/mgar/gar/v2-fortran/bin/custom-pkgtrans	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/custom-pkgtrans	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,10 +1,15 @@
-#!/bin/ksh -p
+#!/bin/bash
 # 
 # $Id$
 #
 # This file exists in order to avoid implementing pipelines in Python.  It
 # could be integrated into the package stats collection program.
+#
+# It has to use the same interpreter as lib/sh/libcheckpkg.sh, currently bash.
 
+set -u
+set -e
+
 command_basename=`basename $0`
 command_basedir="${0%/${command_basename}}"
 libshdir="${command_basedir}/../lib/sh"
@@ -12,11 +17,11 @@
 . "${libshdir}/libcheckpkg.sh"
 
 if [[ -z "$1" || -z "$2" || -z "$3" ]]; then
-	print >&2 "usage: $0 <file.pkg> <targetdir> <pkgname>"
+	echo >&2 "usage: $0 <file.pkg> <targetdir> <pkgname>"
 	exit 1
 fi
 if [[ "$3" == "all" ]]; then
-  print >&2 "This script can't handle 'all' as the third argument"
+  echo >&2 "This script can't handle 'all' as the third argument"
   exit 1
 fi
 custom_pkgtrans "$1" "$2" "$3"

Modified: csw/mgar/gar/v2-fortran/bin/depgraph
===================================================================
--- csw/mgar/gar/v2-fortran/bin/depgraph	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/depgraph	2011-01-12 19:26:11 UTC (rev 12515)
@@ -49,7 +49,7 @@
 
             my ($garname, @deps);
             foreach (@lines) {
-                if (/^GARNAME\s*=\s*(\S*)\s*$/) {
+                if (/^NAME\s*=\s*(\S*)\s*$/) {
                     $garname = $1;
                     next;
                 }

Modified: csw/mgar/gar/v2-fortran/bin/gem2pkg
===================================================================
--- csw/mgar/gar/v2-fortran/bin/gem2pkg	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/gem2pkg	2011-01-12 19:26:11 UTC (rev 12515)
@@ -14,8 +14,8 @@
   blurb = spec.description.gsub("\n", ' ').squeeze.lstrip
 
   puts <<"EOF"
-GARNAME = #{spec.name}
-GARVERSION = #{spec.version}
+NAME = #{spec.name}
+VERSION = #{spec.version}
 CATEGORIES = rbgems
 
 DESCRIPTION = #{spec.summary}

Modified: csw/mgar/gar/v2-fortran/bin/mkpackage
===================================================================
--- csw/mgar/gar/v2-fortran/bin/mkpackage	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/mkpackage	2011-01-12 19:26:11 UTC (rev 12515)
@@ -25,6 +25,9 @@
 # Tool Version/Revision Information
 $TOOLVERSION = "1.4";
 ($REVISION) = q/$Revision$/ =~ /(\d+)/;
+# This shows a warning:
+# "Use of uninitialized value $REVISION in sprintf at
+# /home/maciej/src/opencsw/pkg/nspr/trunk/gar/bin/mkpackage line 31."
 $VERSION = sprintf '%s (r%d)', $TOOLVERSION, $REVISION;
 
 # Discover network support

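The warning quoted in the new comment is what Perl emits when $REVISION ends up undefined in the sprintf call; that happens whenever the $Revision$ keyword has not been expanded into a revision number, because the regex then captures nothing. A plausible remedy (a sketch only, not part of this commit) is to make sure keyword expansion is enabled on the script in Subversion:

    svn propget svn:keywords bin/mkpackage      # check the current setting
    svn propset svn:keywords "Revision" bin/mkpackage
    svn commit -m "mkpackage: enable expansion of the Revision keyword" bin/mkpackage
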
Modified: csw/mgar/gar/v2-fortran/bin/update-commondirs
===================================================================
--- csw/mgar/gar/v2-fortran/bin/update-commondirs	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/bin/update-commondirs	2011-01-12 19:26:11 UTC (rev 12515)
@@ -19,7 +19,7 @@
   mkdir $TMPDIR
   (
     cd $TMPDIR
-    wget http://mirror.opencsw.org/opencsw/current/$1/5.8/common-1.4.7,REV=2009.09.20-SunOS5.8-$1-CSW.pkg
+    wget http://mirror.opencsw.org/opencsw/current/$1/5.9/common-1.5,REV=2010.12.11-SunOS5.8-$1-CSW.pkg
     cat common-* | pkgtrans /dev/fd/0 $TMPDIR all
   )
 

Modified: csw/mgar/gar/v2-fortran/categories/cpan/category.mk
===================================================================
--- csw/mgar/gar/v2-fortran/categories/cpan/category.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/categories/cpan/category.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -3,7 +3,7 @@
 MASTER_SITES ?= $(CPAN_MIRRORS)
 
 # This is common to most modules - override in module makefile if different
-MODDIST   ?= $(GARNAME)-$(GARVERSION).tar.gz
+MODDIST   ?= $(NAME)-$(VERSION).tar.gz
 DISTFILES += $(MODDIST)
 CHECKPATH ?= $(firstword $(CPAN_MIRRORS))
 
@@ -28,13 +28,13 @@
 SPKG_SOURCEURL := $(SPKG_SOURCEURL)/~$(call TOLOWER,$(AUTHOR))
 
 # We define the upstream file regex so we can be notified of new upstream software releases
-UFILES_REGEX ?= $(GARNAME)-(\d+(?:\.\d+)*).tar.gz
+UFILES_REGEX ?= $(NAME)-(\d+(?:\.\d+)*).tar.gz
 USTREAM_MASTER_SITE ?= $(SPKG_SOURCEURL)
 
-$(foreach P,$(PACKAGES),$(eval _CATEGORY_SPKG_DESC_$P = $$(GARNAME): $$(or $$(SPKG_DESC_$P),$$(SPKG_DESC))))
-_CATEGORY_PKGINFO = echo "PERL_MODULE_NAME=$(GARNAME)";
+$(foreach P,$(PACKAGES),$(eval _CATEGORY_SPKG_DESC_$P = $$(NAME): $$(or $$(SPKG_DESC_$P),$$(SPKG_DESC))))
+_CATEGORY_PKGINFO = echo "PERL_MODULE_NAME=$(NAME)";
 
-SPKG_SOURCEURL := $(SPKG_SOURCEURL)/$(GARNAME)
+SPKG_SOURCEURL := $(SPKG_SOURCEURL)/$(NAME)
 
 _MERGE_EXCLUDE_CATEGORY = .*/perllocal\.pod .*/\.packlist
 _CATEGORY_GSPEC_INCLUDE ?= csw_cpan_dyngspec.gspec
@@ -48,6 +48,9 @@
 # upstream chose uppercase or not as case must match.
 _CATEGORY_CHECKPKG_OVERRIDES += pkginfo-description-not-starting-with-uppercase
 
+# Copy in META.yml if it exists so checkpkg can check Perl dependencies
+_CATEGORY_FILTER = | ( cat; if test -f "$(WORKDIR_GLOBAL)/META.yml";then echo "i cswpm-meta.yml=META.yml"; fi)
+
 include gar/gar.mk
 
 CONFIGURE_ENV += PERL5LIB=$(PERL5LIB)
@@ -89,25 +92,28 @@
 	( cd $* ; $(INSTALL_ENV) ./Build install $(PERLBUILD_INSTALL_ARGS) )
 	@$(MAKECOOKIE)
 
+pre-package:
+	test -f $(WORKSRC_FIRSTMOD)/META.yml && cp $(WORKSRC_FIRSTMOD)/META.yml $(WORKDIR_GLOBAL)
+
 # Check for a CPAN module version update
 update-check:
-	@echo " ==> Update Check: $(GARNAME) $(GARVERSION)"
+	@echo " ==> Update Check: $(NAME) $(VERSION)"
 	@if test "x$(MANUAL_UPDATE)" != "x0" ; then \
 	    cpan_check $(CHECKPATH)$(MODDIST) \
 	               $(CURDIR)/../update_results.txt ; \
 	else \
-	    echo " ==> AUTO UPDATE CHECK FOR $(GARNAME) IS DISABLED" ; \
+	    echo " ==> AUTO UPDATE CHECK FOR $(NAME) IS DISABLED" ; \
 	fi
 	
 # Print HTML info for modules
 module-info:
-	@echo " ==> Generating module info for $(GARNAME) $(GARVERSION)"
+	@echo " ==> Generating module info for $(NAME) $(VERSION)"
 	@printf "<a href=\"http://search.cpan.org/" \
 		>> ../module_info.html
 	@printf "~$(shell echo $(AUTHOR) | tr '[A-Z]' '[a-z]')/" \
 		>> ../module_info.html
-	@printf "$(GARNAME)-$(GARVERSION)" \
+	@printf "$(NAME)-$(VERSION)" \
 		>> ../module_info.html
-	@printf "\">$(GARNAME)-$(GARVERSION)</a><br/>\n" \
+	@printf "\">$(NAME)-$(VERSION)</a><br/>\n" \
 		>> ../module_info.html
 

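The new _CATEGORY_FILTER and pre-package hooks work together: pre-package copies META.yml from the first module's work directory into $(WORKDIR_GLOBAL), and the filter appends an "i" (information file) entry to the generated prototype stream so that, as the new comment notes, checkpkg can check the module's Perl dependencies. In isolation the filter behaves roughly like this (paths and file contents hypothetical):

    mkdir -p /tmp/demo && cd /tmp/demo
    touch META.yml                    # stands in for the file copied by pre-package
    printf 'f none /opt/csw/share/perl/csw/Foo.pm 0444 root bin\n' \
        | ( cat; if test -f META.yml; then echo "i cswpm-meta.yml=META.yml"; fi )
    # output:
    #   f none /opt/csw/share/perl/csw/Foo.pm 0444 root bin
    #   i cswpm-meta.yml=META.yml
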
Modified: csw/mgar/gar/v2-fortran/categories/java/category.mk
===================================================================
--- csw/mgar/gar/v2-fortran/categories/java/category.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/categories/java/category.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,7 +1,7 @@
 # http://jakarta.apache.org/commons
 
 # We define the upstream file regex so we can be notified of new upstream software releases
-UFILES_REGEX ?= commons-$(GARNAME)-(\d+(?:\.\d+)*)-bin.tar.gz
+UFILES_REGEX ?= commons-$(NAME)-(\d+(?:\.\d+)*)-bin.tar.gz
 USTREAM_MASTER_SITE ?= $(SPKG_SOURCEURL)
 
 # Includes the rest of gar

Modified: csw/mgar/gar/v2-fortran/categories/kde/category.mk
===================================================================
--- csw/mgar/gar/v2-fortran/categories/kde/category.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/categories/kde/category.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -6,8 +6,8 @@
 KDE_MIRROR    = $(KDE_ROOT)/$(KDE_DIST)/$(KDE_VERSION)/src/
 
 MASTER_SITES ?= $(KDE_MIRROR)
-GARVERSION   ?= $(KDE_VERSION)
-PKGDIST      ?= $(GARNAME)-$(GARVERSION).tar.bz2
+VERSION   ?= $(KDE_VERSION)
+PKGDIST      ?= $(NAME)-$(VERSION).tar.bz2
 DISTFILES    += $(PKGDIST)
 
 # Compiler

Modified: csw/mgar/gar/v2-fortran/categories/kde4/category.mk
===================================================================
--- csw/mgar/gar/v2-fortran/categories/kde4/category.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/categories/kde4/category.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -6,8 +6,8 @@
 KDE_MIRROR    = $(KDE_ROOT)/$(KDE_DIST)/$(KDE_VERSION)/src/
 
 MASTER_SITES ?= $(KDE_MIRROR)
-GARVERSION   ?= $(KDE_VERSION)
-PKGDIST      ?= $(DISTNAME)-$(GARVERSION).tar.bz2
+VERSION   ?= $(KDE_VERSION)
+PKGDIST      ?= $(DISTNAME)-$(VERSION).tar.bz2
 DISTFILES    += $(PKGDIST)
 
 # Compiler

Modified: csw/mgar/gar/v2-fortran/categories/rbgems/category.mk
===================================================================
--- csw/mgar/gar/v2-fortran/categories/rbgems/category.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/categories/rbgems/category.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -4,13 +4,13 @@
 MASTER_SITES ?= http://rubygems.org/downloads/
 
 # This is common to most modules - override in module makefile if different
-GEMNAME ?= $(GARNAME)
-GEMVERSION ?= $(GARVERSION)
+GEMNAME ?= $(NAME)
+GEMVERSION ?= $(VERSION)
 GEMFILE   ?= $(GEMNAME)-$(GEMVERSION).gem
 DISTFILES += $(GEMFILE)
 
 GEMPKGNAME ?= $(GEMNAME)
-GEMCATALOGNAME ?= $(GEMPKGNAME)
+GEMCATALOGNAME ?= $(subst -,_,$(GEMPKGNAME))
 
 # PACKAGES ?= CSWgem-$(GEMPKGNAME) CSWgem-$(GEMPKGNAME)-doc
 PACKAGES ?= CSWgem-$(GEMPKGNAME)
@@ -86,5 +86,5 @@
 # Check for a CPAN module version update
 update-check:
 	@# TBD!
-	@echo " ==> Update Check: $(GARNAME) $(GARVERSION)"
-	@echo " ==> AUTO UPDATE CHECK FOR $(GARNAME) IS DISABLED"
+	@echo " ==> Update Check: $(NAME) $(VERSION)"
+	@echo " ==> AUTO UPDATE CHECK FOR $(NAME) IS DISABLED"

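The new default for GEMCATALOGNAME rewrites dashes to underscores, presumably because catalog names may not contain dashes while gem names commonly do. The make-level $(subst -,_,...) corresponds to this shell transformation (gem name hypothetical):

    GEMPKGNAME=rspec-core
    echo "$GEMPKGNAME" | tr '-' '_'    # prints: rspec_core
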
Modified: csw/mgar/gar/v2-fortran/categories/xfce/category.mk
===================================================================
--- csw/mgar/gar/v2-fortran/categories/xfce/category.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/categories/xfce/category.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -4,8 +4,8 @@
 XFCE_MIRROR     = $(XFCE_ROOT)/archive/xfce-$(XFCE_VERSION)/src/
 
 MASTER_SITES   ?= $(XFCE_MIRROR)
-GARVERSION     ?= $(XFCE_VERSION)
-PKGDIST        ?= $(GARNAME)-$(GARVERSION).tar.bz2
+VERSION     ?= $(XFCE_VERSION)
+PKGDIST        ?= $(NAME)-$(VERSION).tar.bz2
 DISTFILES      += $(PKGDIST)
 
 # Compiler options

Modified: csw/mgar/gar/v2-fortran/etc/commondirs-i386
===================================================================
--- csw/mgar/gar/v2-fortran/etc/commondirs-i386	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/etc/commondirs-i386	2011-01-12 19:26:11 UTC (rev 12515)
@@ -42,28 +42,22 @@
 /opt/csw/share/locale/be/LC_MESSAGES
 /opt/csw/share/locale/bg
 /opt/csw/share/locale/bg/LC_MESSAGES
-/opt/csw/share/locale/bg/LC_TIME
 /opt/csw/share/locale/ca
 /opt/csw/share/locale/ca/LC_MESSAGES
 /opt/csw/share/locale/cs
 /opt/csw/share/locale/cs/LC_MESSAGES
-/opt/csw/share/locale/cs/LC_TIME
 /opt/csw/share/locale/da
 /opt/csw/share/locale/da/LC_MESSAGES
-/opt/csw/share/locale/da/LC_TIME
 /opt/csw/share/locale/de
 /opt/csw/share/locale/de/LC_MESSAGES
-/opt/csw/share/locale/de/LC_TIME
 /opt/csw/share/locale/el
 /opt/csw/share/locale/el/LC_MESSAGES
-/opt/csw/share/locale/el/LC_TIME
 /opt/csw/share/locale/en@boldquot
 /opt/csw/share/locale/en@boldquot/LC_MESSAGES
 /opt/csw/share/locale/en@quot
 /opt/csw/share/locale/en@quot/LC_MESSAGES
 /opt/csw/share/locale/es
 /opt/csw/share/locale/es/LC_MESSAGES
-/opt/csw/share/locale/es/LC_TIME
 /opt/csw/share/locale/et
 /opt/csw/share/locale/et/LC_MESSAGES
 /opt/csw/share/locale/eu
@@ -72,12 +66,10 @@
 /opt/csw/share/locale/fi/LC_MESSAGES
 /opt/csw/share/locale/fr
 /opt/csw/share/locale/fr/LC_MESSAGES
-/opt/csw/share/locale/fr/LC_TIME
 /opt/csw/share/locale/ga
 /opt/csw/share/locale/ga/LC_MESSAGES
 /opt/csw/share/locale/gl
 /opt/csw/share/locale/gl/LC_MESSAGES
-/opt/csw/share/locale/gl/LC_TIME
 /opt/csw/share/locale/he
 /opt/csw/share/locale/he/LC_MESSAGES
 /opt/csw/share/locale/hr
@@ -88,50 +80,38 @@
 /opt/csw/share/locale/id/LC_MESSAGES
 /opt/csw/share/locale/it
 /opt/csw/share/locale/it/LC_MESSAGES
-/opt/csw/share/locale/it/LC_TIME
 /opt/csw/share/locale/ja
 /opt/csw/share/locale/ja/LC_MESSAGES
-/opt/csw/share/locale/ja/LC_TIME
 /opt/csw/share/locale/ko
 /opt/csw/share/locale/ko/LC_MESSAGES
-/opt/csw/share/locale/ko/LC_TIME
 /opt/csw/share/locale/lt
 /opt/csw/share/locale/lt/LC_MESSAGES
 /opt/csw/share/locale/nl
 /opt/csw/share/locale/nl/LC_MESSAGES
-/opt/csw/share/locale/nl/LC_TIME
 /opt/csw/share/locale/nn
 /opt/csw/share/locale/nn/LC_MESSAGES
 /opt/csw/share/locale/no
 /opt/csw/share/locale/no/LC_MESSAGES
-/opt/csw/share/locale/no/LC_TIME
 /opt/csw/share/locale/pl
 /opt/csw/share/locale/pl/LC_MESSAGES
-/opt/csw/share/locale/pl/LC_TIME
 /opt/csw/share/locale/pt
 /opt/csw/share/locale/pt/LC_MESSAGES
-/opt/csw/share/locale/pt/LC_TIME
 /opt/csw/share/locale/pt_BR
 /opt/csw/share/locale/pt_BR/LC_MESSAGES
-/opt/csw/share/locale/pt_BR/LC_TIME
 /opt/csw/share/locale/ro
 /opt/csw/share/locale/ro/LC_MESSAGES
 /opt/csw/share/locale/ru
 /opt/csw/share/locale/ru/LC_MESSAGES
-/opt/csw/share/locale/ru/LC_TIME
 /opt/csw/share/locale/sk
 /opt/csw/share/locale/sk/LC_MESSAGES
-/opt/csw/share/locale/sk/LC_TIME
 /opt/csw/share/locale/sl
 /opt/csw/share/locale/sl/LC_MESSAGES
-/opt/csw/share/locale/sl/LC_TIME
 /opt/csw/share/locale/sp
 /opt/csw/share/locale/sp/LC_MESSAGES
 /opt/csw/share/locale/sr
 /opt/csw/share/locale/sr/LC_MESSAGES
 /opt/csw/share/locale/sv
 /opt/csw/share/locale/sv/LC_MESSAGES
-/opt/csw/share/locale/sv/LC_TIME
 /opt/csw/share/locale/tr
 /opt/csw/share/locale/tr/LC_MESSAGES
 /opt/csw/share/locale/uk
@@ -142,7 +122,6 @@
 /opt/csw/share/locale/wa/LC_MESSAGES
 /opt/csw/share/locale/zh
 /opt/csw/share/locale/zh/LC_MESSAGES
-/opt/csw/share/locale/zh/LC_TIME
 /opt/csw/share/locale/zh_CN
 /opt/csw/share/locale/zh_CN.GB2312
 /opt/csw/share/locale/zh_CN.GB2312/LC_MESSAGES

Modified: csw/mgar/gar/v2-fortran/etc/commondirs-sparc
===================================================================
--- csw/mgar/gar/v2-fortran/etc/commondirs-sparc	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/etc/commondirs-sparc	2011-01-12 19:26:11 UTC (rev 12515)
@@ -46,28 +46,22 @@
 /opt/csw/share/locale/be/LC_MESSAGES
 /opt/csw/share/locale/bg
 /opt/csw/share/locale/bg/LC_MESSAGES
-/opt/csw/share/locale/bg/LC_TIME
 /opt/csw/share/locale/ca
 /opt/csw/share/locale/ca/LC_MESSAGES
 /opt/csw/share/locale/cs
 /opt/csw/share/locale/cs/LC_MESSAGES
-/opt/csw/share/locale/cs/LC_TIME
 /opt/csw/share/locale/da
 /opt/csw/share/locale/da/LC_MESSAGES
-/opt/csw/share/locale/da/LC_TIME
 /opt/csw/share/locale/de
 /opt/csw/share/locale/de/LC_MESSAGES
-/opt/csw/share/locale/de/LC_TIME
 /opt/csw/share/locale/el
 /opt/csw/share/locale/el/LC_MESSAGES
-/opt/csw/share/locale/el/LC_TIME
 /opt/csw/share/locale/en@boldquot
 /opt/csw/share/locale/en@boldquot/LC_MESSAGES
 /opt/csw/share/locale/en@quot
 /opt/csw/share/locale/en@quot/LC_MESSAGES
 /opt/csw/share/locale/es
 /opt/csw/share/locale/es/LC_MESSAGES
-/opt/csw/share/locale/es/LC_TIME
 /opt/csw/share/locale/et
 /opt/csw/share/locale/et/LC_MESSAGES
 /opt/csw/share/locale/eu
@@ -76,12 +70,10 @@
 /opt/csw/share/locale/fi/LC_MESSAGES
 /opt/csw/share/locale/fr
 /opt/csw/share/locale/fr/LC_MESSAGES
-/opt/csw/share/locale/fr/LC_TIME
 /opt/csw/share/locale/ga
 /opt/csw/share/locale/ga/LC_MESSAGES
 /opt/csw/share/locale/gl
 /opt/csw/share/locale/gl/LC_MESSAGES
-/opt/csw/share/locale/gl/LC_TIME
 /opt/csw/share/locale/he
 /opt/csw/share/locale/he/LC_MESSAGES
 /opt/csw/share/locale/hr
@@ -92,50 +84,38 @@
 /opt/csw/share/locale/id/LC_MESSAGES
 /opt/csw/share/locale/it
 /opt/csw/share/locale/it/LC_MESSAGES
-/opt/csw/share/locale/it/LC_TIME
 /opt/csw/share/locale/ja
 /opt/csw/share/locale/ja/LC_MESSAGES
-/opt/csw/share/locale/ja/LC_TIME
 /opt/csw/share/locale/ko
 /opt/csw/share/locale/ko/LC_MESSAGES
-/opt/csw/share/locale/ko/LC_TIME
 /opt/csw/share/locale/lt
 /opt/csw/share/locale/lt/LC_MESSAGES
 /opt/csw/share/locale/nl
 /opt/csw/share/locale/nl/LC_MESSAGES
-/opt/csw/share/locale/nl/LC_TIME
 /opt/csw/share/locale/nn
 /opt/csw/share/locale/nn/LC_MESSAGES
 /opt/csw/share/locale/no
 /opt/csw/share/locale/no/LC_MESSAGES
-/opt/csw/share/locale/no/LC_TIME
 /opt/csw/share/locale/pl
 /opt/csw/share/locale/pl/LC_MESSAGES
-/opt/csw/share/locale/pl/LC_TIME
 /opt/csw/share/locale/pt
 /opt/csw/share/locale/pt/LC_MESSAGES
-/opt/csw/share/locale/pt/LC_TIME
 /opt/csw/share/locale/pt_BR
 /opt/csw/share/locale/pt_BR/LC_MESSAGES
-/opt/csw/share/locale/pt_BR/LC_TIME
 /opt/csw/share/locale/ro
 /opt/csw/share/locale/ro/LC_MESSAGES
 /opt/csw/share/locale/ru
 /opt/csw/share/locale/ru/LC_MESSAGES
-/opt/csw/share/locale/ru/LC_TIME
 /opt/csw/share/locale/sk
 /opt/csw/share/locale/sk/LC_MESSAGES
-/opt/csw/share/locale/sk/LC_TIME
 /opt/csw/share/locale/sl
 /opt/csw/share/locale/sl/LC_MESSAGES
-/opt/csw/share/locale/sl/LC_TIME
 /opt/csw/share/locale/sp
 /opt/csw/share/locale/sp/LC_MESSAGES
 /opt/csw/share/locale/sr
 /opt/csw/share/locale/sr/LC_MESSAGES
 /opt/csw/share/locale/sv
 /opt/csw/share/locale/sv/LC_MESSAGES
-/opt/csw/share/locale/sv/LC_TIME
 /opt/csw/share/locale/tr
 /opt/csw/share/locale/tr/LC_MESSAGES
 /opt/csw/share/locale/uk
@@ -146,7 +126,6 @@
 /opt/csw/share/locale/wa/LC_MESSAGES
 /opt/csw/share/locale/zh
 /opt/csw/share/locale/zh/LC_MESSAGES
-/opt/csw/share/locale/zh/LC_TIME
 /opt/csw/share/locale/zh_CN
 /opt/csw/share/locale/zh_CN.GB2312
 /opt/csw/share/locale/zh_CN.GB2312/LC_MESSAGES

Modified: csw/mgar/gar/v2-fortran/gar.conf.mk
===================================================================
--- csw/mgar/gar/v2-fortran/gar.conf.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/gar.conf.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -631,13 +631,13 @@
 #
 
 # Gnome
-GNOME_PROJ  ?= $(GARNAME)
+GNOME_PROJ  ?= $(NAME)
 GNOME_ROOT   = http://ftp.gnome.org/pub/GNOME/sources
-GNOME_SUBV   = $(shell echo $(GARVERSION) | awk -F. '{print $$1"."$$2}')
+GNOME_SUBV   = $(shell echo $(VERSION) | awk -F. '{print $$1"."$$2}')
 GNOME_MIRROR = $(GNOME_ROOT)/$(GNOME_PROJ)/$(GNOME_SUBV)/
 
 # SourceForge
-SF_PROJ     ?= $(GARNAME)
+SF_PROJ     ?= $(NAME)
 SF_MIRRORS  ?= http://downloads.sourceforge.net/$(SF_PROJ)/
 # Keep this for compatibility
 SF_MIRROR    = $(firstword $(SF_MIRRORS))
@@ -645,18 +645,18 @@
 UPSTREAM_USE_SF	?= 0
 
 # Google Code
-GOOGLE_PROJECT ?= $(GARNAME)
+GOOGLE_PROJECT ?= $(NAME)
 GOOGLE_MIRROR  ?= http://$(GOOGLE_PROJECT).googlecode.com/files/
 
 # Berlios
-BERLIOS_PROJECT ?= $(GARNAME)
+BERLIOS_PROJECT ?= $(NAME)
 BERLIOS_MIRROR ?= http://download.berlios.de/$(BERLIOS_PROJECT)/ http://download2.berlios.de/$(BERLIOS_PROJECT)/
 
 # GNU
 GNU_SITE     = http://mirrors.kernel.org
 GNU_GNUROOT  = $(GNU_SITE)/gnu
 GNU_NGNUROOT = $(GNU_SITE)/non-gnu
-GNU_PROJ    ?= $(GARNAME)
+GNU_PROJ    ?= $(NAME)
 GNU_MIRROR   = $(GNU_GNUROOT)/$(GNU_PROJ)/
 GNU_NMIRROR  = $(GNU_NGNUROOT)/$(GNU_PROJ)/
 
@@ -671,7 +671,7 @@
 CPAN_FIRST_MIRROR = $(firstword $(CPAN_SITES))/authors/id
 
 # Python Package Index
-PYPI_PROJECT ?= $(GARNAME)
+PYPI_PROJECT ?= $(NAME)
 PYPI_SUBDIR = $(shell echo $(PYPI_PROJECT) | cut -c 1)
 PYPI_MIRROR = http://pypi.python.org/packages/source/$(PYPI_SUBDIR)/$(PYPI_PROJECT)/
 

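The mirror definitions keep their logic and only switch from GARNAME/GARVERSION to NAME/VERSION. As a worked example of the GNOME_SUBV derivation (package name and version hypothetical), the awk pipeline embedded in the makefile reduces the version to its first two components:

    VERSION=2.26.1
    echo "$VERSION" | awk -F. '{print $1"."$2}'    # prints: 2.26
    # so for NAME=glib the GNOME_MIRROR expands to
    #   http://ftp.gnome.org/pub/GNOME/sources/glib/2.26/
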
Modified: csw/mgar/gar/v2-fortran/gar.lib.mk
===================================================================
--- csw/mgar/gar/v2-fortran/gar.lib.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/gar.lib.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -200,7 +200,7 @@
 			else \
 				if echo $(DISTFILES) | grep -w $$FILE >/dev/null; then \
 					PACKAGE_UP_TO_DATE=1; \
-					echo "$(GARNAME) : Package is up-to-date. Current version is $$FILE" ; \
+					echo "$(NAME) : Package is up-to-date. Current version is $$FILE" ; \
 				else \
 					NEW_FILES="$$FILE $$NEW_FILES"; \
 				fi; \
@@ -209,11 +209,11 @@
 		done; \
 		if test -z "$$NEW_FILES" ; then \
 			if [ ! -n '$(UFILES_REGEX)' ]; then \
-				echo "$(GARNAME) : Warning UFILES_REGEX is not set : $(UFILES_REGEX)" ; \
+				echo "$(NAME) : Warning UFILES_REGEX is not set : $(UFILES_REGEX)" ; \
 #				{ echo ""; \
-#				  echo "Hello dear $(GARNAME) maintainer,"; \
+#				  echo "Hello dear $(NAME) maintainer,"; \
 #				  echo ""; \
-#				  echo "The upstream notification job has detected that $(GARNAME) is not configured for automatic upstream file update detection."; \
+#				  echo "The upstream notification job has detected that $(NAME) is not configured for automatic upstream file update detection."; \
 #				  echo ""; \
 #				  echo "Please consider updating your package. Documentation is available from this link : http://www.opencsw.org" ; \
 #				  echo ""; \
@@ -221,22 +221,22 @@
 #				  echo ""; \
 #				  echo "--"; \
 #				  echo "Kindest regards"; \
-#				  echo "upstream notification job"; } | $(GARBIN)/mail2maintainer -s '[svn] $(GARNAME) upstream update notification' $(GARNAME); \
+#				  echo "upstream notification job"; } | $(GARBIN)/mail2maintainer -s '[svn] $(NAME) upstream update notification' $(NAME); \
 			else \
 				if [ "$$PACKAGE_UP_TO_DATE" -eq "0" ]; then \
-					echo "$(GARNAME) : Warning no files to check ! $(FILES2CHECK)" ; \
-					echo "$(GARNAME) :     UPSTREAM_MASTER_SITES is $(UPSTREAM_MASTER_SITES)" ; \
-					echo "$(GARNAME) :     DISTNAME is $(DISTNAME)" ; \
-					echo "$(GARNAME) :     UFILES_REGEX is : $(UFILES_REGEX)" ; \
-					echo "$(GARNAME) : Please check configuration" ; \
+					echo "$(NAME) : Warning no files to check ! $(FILES2CHECK)" ; \
+					echo "$(NAME) :     UPSTREAM_MASTER_SITES is $(UPSTREAM_MASTER_SITES)" ; \
+					echo "$(NAME) :     DISTNAME is $(DISTNAME)" ; \
+					echo "$(NAME) :     UFILES_REGEX is : $(UFILES_REGEX)" ; \
+					echo "$(NAME) : Please check configuration" ; \
 				fi; \
 			fi; \
 		else \
-			echo "$(GARNAME) : new upstream files available: $$NEW_FILES"; \
+			echo "$(NAME) : new upstream files available: $$NEW_FILES"; \
 			{	echo ""; \
-				echo "Hello dear $(GARNAME) maintainer,"; \
+				echo "Hello dear $(NAME) maintainer,"; \
 				echo ""; \
-				echo "The upstream notification job has detected the availability of new files for $(GARNAME)."; \
+				echo "The upstream notification job has detected the availability of new files for $(NAME)."; \
 				echo ""; \
 				echo "The following upstream file(s):"; \
 				echo "    $$NEW_FILES"; \
@@ -250,7 +250,7 @@
 				echo ""; \
 				echo "--"; \
 				echo "Kindest regards"; \
-				echo "upstream notification job"; } | $(GARBIN)/mail2maintainer -s '[svn] $(GARNAME) upstream update notification' $(GARNAME); \
+				echo "upstream notification job"; } | $(GARBIN)/mail2maintainer -s '[svn] $(NAME) upstream update notification' $(NAME); \
 		fi; \
 	fi
 
@@ -266,7 +266,7 @@
 			else \
 				if echo $(DISTFILES) | grep -w $$FILE >/dev/null; then \
 					PACKAGE_UP_TO_DATE=1; \
-					echo "$(GARNAME) : Package is up-to-date. Current version is $$FILE" ; \
+					echo "$(NAME) : Package is up-to-date. Current version is $$FILE" ; \
 				else \
 					NEW_FILES="$$FILE $$NEW_FILES"; \
 				fi; \
@@ -275,18 +275,18 @@
 		done; \
 		if test -z "$$NEW_FILES" ; then \
 			if [ ! -n '$(UFILES_REGEX)' ]; then \
-				echo "$(GARNAME) : Warning UFILES_REGEX is not set : $(UFILES_REGEX)" ; \
+				echo "$(NAME) : Warning UFILES_REGEX is not set : $(UFILES_REGEX)" ; \
 			else \
 				if [ "$$PACKAGE_UP_TO_DATE" -eq "0" ]; then \
-					echo "$(GARNAME) : Warning no files to check ! $(FILES2CHECK)" ; \
-					echo "$(GARNAME) :     UPSTREAM_MASTER_SITES is $(UPSTREAM_MASTER_SITES)" ; \
-					echo "$(GARNAME) :     DISTNAME is $(DISTNAME)" ; \
-					echo "$(GARNAME) :     UFILES_REGEX is : $(UFILES_REGEX)" ; \
-					echo "$(GARNAME) : Please check configuration" ; \
+					echo "$(NAME) : Warning no files to check ! $(FILES2CHECK)" ; \
+					echo "$(NAME) :     UPSTREAM_MASTER_SITES is $(UPSTREAM_MASTER_SITES)" ; \
+					echo "$(NAME) :     DISTNAME is $(DISTNAME)" ; \
+					echo "$(NAME) :     UFILES_REGEX is : $(UFILES_REGEX)" ; \
+					echo "$(NAME) : Please check configuration" ; \
 				fi; \
 			fi; \
 		else \
-			echo "$(GARNAME) : new upstream files available: $$NEW_FILES"; \
+			echo "$(NAME) : new upstream files available: $$NEW_FILES"; \
 		fi; \
 	fi
 	
@@ -380,7 +380,7 @@
 # to supply an alternate target at their discretion
 git-extract-%:
 	@echo " ===> Extracting Git Repo $(DOWNLOADDIR)/$* (Treeish: $(call GIT_TREEISH,$*))"
-	git --bare archive --prefix=$(GARNAME)-$(GARVERSION)/ --remote=file://$(abspath $(DOWNLOADDIR))/$*/ $(call GIT_TREEISH,$*) | gtar -xf - -C $(EXTRACTDIR)
+	git --bare archive --prefix=$(NAME)-$(VERSION)/ --remote=file://$(abspath $(DOWNLOADDIR))/$*/ $(call GIT_TREEISH,$*) | gtar -xf - -C $(EXTRACTDIR)
 	@$(MAKECOOKIE)
 
 # rule to extract files with unzip
@@ -768,8 +768,8 @@
 
 # pkg-config scripts
 install-%-config:
-	mkdir -p $(STAGINGDIR)/$(GARNAME)
-	cp -f $(DESTDIR)$(bindir)/$*-config $(STAGINGDIR)/$(GARNAME)/
+	mkdir -p $(STAGINGDIR)/$(NAME)
+	cp -f $(DESTDIR)$(bindir)/$*-config $(STAGINGDIR)/$(NAME)/
 	$(MAKECOOKIE)
 
 ######################################

Modified: csw/mgar/gar/v2-fortran/gar.mk
===================================================================
--- csw/mgar/gar/v2-fortran/gar.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/gar.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -15,6 +15,9 @@
 $(error Your version of 'make' is too old: $(MAKE_VERSION). Please make sure you are using at least 3.81)
 endif
 
+$(if $(GARNAME),$(error The deprecated variable 'GARNAME' is defined, please replace it with 'NAME'))
+$(if $(GARVERSION),$(error The deprecated variable 'GARVERSION' is defined, please replace it with 'VERSION'))
+
 # $(GARDIR) is pre-set by the top-level category.mk
 GARBIN  = $(GARDIR)/bin
 
@@ -33,7 +36,7 @@
 PARALLELMFLAGS ?= $(MFLAGS)
 export PARALLELMFLAGS
 
-DISTNAME ?= $(GARNAME)-$(GARVERSION)
+DISTNAME ?= $(NAME)-$(VERSION)
 
 DYNSCRIPTS = $(foreach PKG,$(SPKG_SPECS),$(foreach SCR,$(ADMSCRIPTS),$(if $(value $(PKG)_$(SCR)), $(PKG).$(SCR))))
 _LOCALFILES = $(notdir $(wildcard files/*))
@@ -57,9 +60,9 @@
 # For rules that do nothing, display what dependencies they
 # successfully completed
 #DONADA = @echo "	[$@] complete.  Finished rules: $+"
-#DONADA = @touch $(COOKIEDIR)/$@; echo "	[$@] complete for $(GARNAME)."
+#DONADA = @touch $(COOKIEDIR)/$@; echo "	[$@] complete for $(NAME)."
 COOKIEFILE = $(COOKIEDIR)/$(patsubst $(COOKIEDIR)/%,%,$1)
-DONADA = @touch $(call COOKIEFILE,$@); echo "	[$@] complete for $(GARNAME)."
+DONADA = @touch $(call COOKIEFILE,$@); echo "	[$@] complete for $(NAME)."
 
 
 # TODO: write a stub rule to print out the name of a rule when it
@@ -425,8 +428,8 @@
 	@( if [ -d "$(WORKSRC)" ]; then \
 		echo ' ==> Snapshotting extracted source tree with git'; \
 		cd $(WORKSRC); git init; git add .; \
-		git commit -m "Upstream $(GARVERSION)"; \
-		git tag -am "Upstream $(GARVERSION)" upstream-$(GARVERSION); \
+		git commit -m "Upstream $(VERSION)"; \
+		git tag -am "Upstream $(VERSION)" upstream-$(VERSION); \
 		git checkout -b csw; \
 	   fi )
 	@$(MAKECOOKIE)
@@ -441,7 +444,7 @@
 _var_definitions = $(foreach VAR,$(shell perl -ne 'print "$$1 " if( /@([^@]+)@/ )' <$1),$(VAR)=$($(VAR)))
 
 expandvars-%:
-	$(call _var_definitions,$(WORKDIR)/$*) perl -i-unexpanded -npe 's/@([^@]+)@/$$ENV{$$1}/e' $(WORKDIR)/$*
+	$(call _var_definitions,$(WORKDIR)/$*) perl -i-unexpanded -npe 's/@([^@]+)@/$$ENV{$$1}/eg' $(WORKDIR)/$*
 	@$(MAKECOOKIE)
 
 
@@ -471,7 +474,7 @@
 	@( if [ -d "$(WORKSRC)/.git" ]; then \
 		echo "Tagging top of current csw patch stack..."; \
 		cd $(WORKSRC); \
-		git tag -am "CSW $(GARVERSION)" csw-$(GARVERSION); \
+		git tag -am "CSW $(VERSION)" csw-$(VERSION); \
 	  fi )
 	@$(MAKECOOKIE)
 
@@ -495,7 +498,7 @@
 			echo "Capturing changes..."; \
 			git commit $(GIT_COMMIT_OPTS) && \
 			( NEXTPATCH=`git log --pretty=oneline master..HEAD | wc -l | tr -d '[[:space:]]'`; \
-			git format-patch --start-number=$$NEXTPATCH csw-$(GARVERSION); \
+			git format-patch --start-number=$$NEXTPATCH csw-$(VERSION); \
 			echo Add the following to your recipe and then; \
 			NEWPATCHES=`echo 00*-*patch`; \
 			FILES_PATCHES=`for p in $$NEWPATCHES; do echo files/$$p; done`; \

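With the new guards at the top of gar.mk, a recipe that still sets GARNAME or GARVERSION now stops immediately with an explicit $(error ...) instead of misbehaving later in the build. A possible one-shot migration of an existing recipe Makefile, sketched here and not part of this commit (GNU sed assumed):

    cd pkg/foo/trunk         # hypothetical recipe directory
    gsed -i \
        -e 's/^GARNAME\([[:space:]]*=\)/NAME\1/' \
        -e 's/^GARVERSION\([[:space:]]*=\)/VERSION\1/' \
        Makefile
    gmake remerge repackage  # rebuild with the new variable names

The other small fix in this file is the added /g modifier in the expandvars-% rule, which makes the substitution replace every @VAR@ token on a line instead of only the first.
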
Modified: csw/mgar/gar/v2-fortran/gar.pkg.mk
===================================================================
--- csw/mgar/gar/v2-fortran/gar.pkg.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/gar.pkg.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -22,8 +22,8 @@
 PKGINFO ?= /usr/bin/pkginfo
 
 # You can use either PACKAGES with dynamic gspec-files or explicitly add gspec-files to DISTFILES.
-# Do "PACKAGES = CSWmypkg" when you build a package whose GARNAME is not the package name.
-# If no explicit gspec-files have been defined the default name for the package is CSW$(GARNAME).
+# Do "PACKAGES = CSWmypkg" when you build a package whose NAME is not the package name.
+# If no explicit gspec-files have been defined the default name for the package is CSW$(NAME).
 # The whole processing is done from _PKG_SPECS, which includes all packages to be build.
 
 # SRCPACKAGE_BASE is the name of the package containing the sourcefiles for all packages
@@ -31,8 +31,8 @@
 # SRCPACKAGE is the name of the package containing the sources
 
 ifeq ($(origin PACKAGES), undefined)
-PACKAGES        = $(if $(filter %.gspec,$(DISTFILES)),,CSW$(GARNAME))
-CATALOGNAME    ?= $(if $(filter %.gspec,$(DISTFILES)),,$(GARNAME))
+PACKAGES        = $(if $(filter %.gspec,$(DISTFILES)),,CSW$(NAME))
+CATALOGNAME    ?= $(if $(filter %.gspec,$(DISTFILES)),,$(NAME))
 SRCPACKAGE_BASE = $(firstword $(basename $(filter %.gspec,$(DISTFILES))) $(PACKAGES))
 SRCPACKAGE     ?= $(SRCPACKAGE_BASE)-src
 SPKG_SPECS     ?= $(basename $(filter %.gspec,$(DISTFILES))) $(PACKAGES) $(if $(NOSOURCEPACKAGE),,$(SRCPACKAGE))
@@ -52,6 +52,7 @@
 GARPKG_v1 = CSWgar-v1
 GARPKG_v2 = CSWgar-v2
 RUNTIME_DEP_PKGS_$(SRCPACKAGE) ?= $(or $(GARPKG_$(GARSYSTEMVERSION)),$(error GAR version $(GARSYSTEMVERSION) unknown))
+CATALOG_RELEASE ?= current
 
 _PKG_SPECS      = $(filter-out $(NOPACKAGE),$(SPKG_SPECS))
 
@@ -78,7 +79,7 @@
 
 _PKG_SPECS      = $(filter-out $(NOPACKAGE),$(SPKG_SPECS))
 
-BUNDLE ?= $(GARNAME)
+BUNDLE ?= $(NAME)
 
 # pkgname - Get the name of a package from a gspec-name or package-name
 #
@@ -150,7 +151,7 @@
 
 
 SPKG_DESC      ?= $(DESCRIPTION)
-SPKG_VERSION   ?= $(GARVERSION)
+SPKG_VERSION   ?= $(VERSION)
 SPKG_CATEGORY  ?= application
 SPKG_SOURCEURL ?= $(firstword $(VENDOR_URL) \
 			$(if $(filter $(GNU_MIRROR),$(MASTER_SITES)),http://www.gnu.org/software/$(GNU_PROJ)) \
@@ -194,6 +195,7 @@
 
 # - set class for all config files
 _CSWCLASS_FILTER = | perl -ane '\
+		$(foreach FILE,$(CPTEMPLATES),$$F[1] = "cswcptemplates" if( $$F[2] =~ m(^$(FILE)$$) );)\
 		$(foreach FILE,$(MIGRATECONF),$$F[1] = "cswmigrateconf" if( $$F[2] =~ m(^$(FILE)$$) );)\
 		$(foreach FILE,$(SAMPLECONF:%\.CSW=%),$$F[1] = "cswcpsampleconf" if ( $$F[2] =~ m(^$(FILE)\.CSW$$) );)\
 		$(foreach FILE,$(PRESERVECONF:%\.CSW=%),$$F[1] = "cswpreserveconf" if( $$F[2] =~ m(^$(FILE)\.CSW$$) );)\
@@ -248,7 +250,7 @@
 # Where we find our mkpackage global templates
 PKGLIB = $(GARDIR)/pkglib
 
-PKG_EXPORTS  = GARNAME GARVERSION DESCRIPTION CATEGORIES GARCH GARDIR GARBIN
+PKG_EXPORTS  = NAME VERSION DESCRIPTION CATEGORIES GARCH GARDIR GARBIN
 PKG_EXPORTS += CURDIR WORKDIR WORKDIR_FIRSTMOD WORKSRC WORKSRC_FIRSTMOD PKGROOT
 PKG_EXPORTS += SPKG_REVSTAMP SPKG_PKGNAME SPKG_DESC SPKG_VERSION SPKG_CATEGORY
 PKG_EXPORTS += SPKG_VENDOR SPKG_EMAIL SPKG_PSTAMP SPKG_BASEDIR SPKG_CLASSES
@@ -431,9 +433,9 @@
 	               ) \
 	              <$(PROTOTYPE); \
 	   if [ -n "$(EXTRA_PKGFILES_$*)" ]; then echo "$(EXTRA_PKGFILES_$*)"; fi \
-	  ) $(call checkpkg_override_filter,$*) $(_CSWCLASS_FILTER) $(_PROTOTYPE_MODIFIERS) $(_PROTOTYPE_FILTER_$*) >$@; \
+	  ) $(call checkpkg_override_filter,$*) $(_CSWCLASS_FILTER) $(_CATEGORY_FILTER) $(_PROTOTYPE_MODIFIERS) $(_PROTOTYPE_FILTER_$*) >$@; \
 	else \
-	  cat $(PROTOTYPE) $(call checkpkg_override_filter,$*) $(_CSWCLASS_FILTER) $(_PROTOTYPE_MODIFIERS) $(_PROTOTYPE_FILTER_$*) >$@; \
+	  cat $(PROTOTYPE) $(call checkpkg_override_filter,$*) $(_CSWCLASS_FILTER) $(_CATEGORY_FILTER) $(_PROTOTYPE_MODIFIERS) $(_PROTOTYPE_FILTER_$*) >$@; \
 	fi
 	$(if $(ALLOW_RELOCATE),$(call dontrelocate,opt,$(PROTOTYPE)))
 
@@ -463,7 +465,7 @@
 $(WORKDIR)/%.depend: $(WORKDIR)/$*.prototype
 $(WORKDIR)/%.depend: _EXTRA_GAR_PKGS += $(_CATEGORY_RUNTIME_DEP_PKGS)
 $(WORKDIR)/%.depend: _EXTRA_GAR_PKGS += $(if $(strip $(shell cat $(WORKDIR)/$*.prototype | perl -ane 'print "yes" if( $$F[1] eq "cswalternatives")')),CSWalternatives)
-$(WORKDIR)/%.depend: _EXTRA_GAR_PKGS += $(if $(strip $(shell cat $(WORKDIR)/$*.prototype | perl -ane '$(foreach C,$(_CSWCLASSES),print "$C\n" if( $$F[1] eq "$C");)')),CSWcswclassutils)
+$(WORKDIR)/%.depend: _EXTRA_GAR_PKGS += $(foreach P,$(strip $(shell cat $(WORKDIR)/$*.prototype | perl -ane '$(foreach C,$(filter-out ugfiles,$(_CSWCLASSES)),print "$C " if( $$F[1] eq "$C");)')),CSWcas-$(subst csw,,$(P)))
 
 $(WORKDIR)/%.depend: _DEP_PKGS=$(or $(RUNTIME_DEP_PKGS_ONLY_$*),$(RUNTIME_DEP_PKGS_ONLY),$(sort $(_EXTRA_GAR_PKGS)) $(or $(RUNTIME_DEP_PKGS_$*),$(RUNTIME_DEP_PKGS),$(DEP_PKGS_$*),$(DEP_PKGS)))
 $(WORKDIR)/%.depend: $(WORKDIR)
@@ -484,7 +486,7 @@
 # Dynamic gspec-files are constructed as follows:
 # - Packages using dynamic gspec-files must be listed in PACKAGES
 # - There is a default of PACKAGES containing one package named CSW
-#   followed by the GARNAME. It can be changed by setting PACKAGES explicitly.
+#   followed by the NAME. It can be changed by setting PACKAGES explicitly.
 # - The name of the generated package is always the same as listed in PACKAGES
 # - The catalog name defaults to the suffix following CSW of the package name,
 #   but can be customized by setting CATALOGNAME_<pkg> = <catalogname-of-pkg>
@@ -708,7 +710,6 @@
 
 merge-checkpkgoverrides-%:
 	@echo "[ Generating checkpkg override for package $* ]"
-	$(_DBG)ginstall -d $(PKGROOT)/opt/csw/share/checkpkg/overrides
 	$(_DBG)($(foreach O,$(or $(CHECKPKG_OVERRIDES_$*),$(CHECKPKG_OVERRIDES)) $(_CATEGORY_CHECKPKG_OVERRIDES),echo "$O";)) | \
 		perl -F'\|' -ane 'unshift @F,"$*"; $$F[0].=":"; print join(" ", at F );' \
 		> $(WORKDIR_GLOBAL)/checkpkg_override.$*
@@ -843,7 +844,11 @@
 # pkgcheck - check if the package is compliant
 #
 pkgcheck: $(foreach SPEC,$(_PKG_SPECS),package-$(SPEC))
-	$(_DBG)( LC_ALL=C $(GARBIN)/checkpkg $(foreach SPEC,$(_PKG_SPECS),$(SPKG_EXPORT)/`$(call _PKG_ENV,$(SPEC)) mkpackage --tmpdir $(SPKG_TMPDIR) -qs $(WORKDIR)/$(SPEC).gspec -D pkgfile`.gz ) || exit 2;)
+	$(_DBG)( LC_ALL=C $(GARBIN)/checkpkg \
+		--architecture "$(GARCH)" \
+		--os-releases "$(SPKG_OSNAME)" \
+		--catalog-release "$(CATALOG_RELEASE)" \
+		$(foreach SPEC,$(_PKG_SPECS),$(SPKG_EXPORT)/`$(call _PKG_ENV,$(SPEC)) mkpackage --tmpdir $(SPKG_TMPDIR) -qs $(WORKDIR)/$(SPEC).gspec -D pkgfile`.gz ) || exit 2;)
 	@$(MAKECOOKIE)
 
 pkgcheck-p:
@@ -930,13 +935,13 @@
 submitpkg-%:
 	@$(if $(filter $(call _REVISION),UNCOMMITTED NOTVERSIONED NOSVN),\
 		$(error You have local files not in the repository. Please commit everything before submitting a package))
-	$(SVN) -m "$(GARNAME): Tag as release $(SPKG_VERSION),$(SPKG_REVSTAMP)$(if $(filter default,$*),, for project '$*')" cp $(_PKGURL)/trunk $(_PKGURL)/tags/$(if $(filter default,$*),,$*_)$(GARNAME)-$(SPKG_VERSION),$(SPKG_REVSTAMP)
+	$(SVN) -m "$(NAME): Tag as release $(SPKG_VERSION),$(SPKG_REVSTAMP)$(if $(filter default,$*),, for project '$*')" cp $(_PKGURL)/trunk $(_PKGURL)/tags/$(if $(filter default,$*),,$*_)$(NAME)-$(SPKG_VERSION),$(SPKG_REVSTAMP)
 
 # dependb - update the dependency database
 #
 dependb:
 	@dependb --db $(SPKG_DEPEND_DB) \
-             --parent $(CATEGORIES)/$(GARNAME) \
+             --parent $(CATEGORIES)/$(NAME) \
              --add $(DEPENDS)
 
 # pkgenv - dump the packaging environment

Modified: csw/mgar/gar/v2-fortran/gar.svn.mk
===================================================================
--- csw/mgar/gar/v2-fortran/gar.svn.mk	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/gar.svn.mk	2011-01-12 19:26:11 UTC (rev 12515)
@@ -27,6 +27,6 @@
 	$(GARDIR)/bin/svnignore work cookies download
 
 scm-tag-release:
-	$(SVN) cp ../trunk ../tags/$(GARNAME)-$(GARVERSION)$(SPKG_REVSTAMP)
+	$(SVN) cp ../trunk ../tags/$(NAME)-$(VERSION)$(SPKG_REVSTAMP)
 
 .PHONY: scm-help scm-update-all scm-update-package scm-update-gar scm-update-ignores scm-tag-release

Modified: csw/mgar/gar/v2-fortran/lib/python/README
===================================================================
--- csw/mgar/gar/v2-fortran/lib/python/README	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/lib/python/README	2011-01-12 19:26:11 UTC (rev 12515)
@@ -1,29 +1,58 @@
-Python libraries, mostly related to checkpkg.
+This directory contains Python libraries, mostly related to checkpkg.
 
+==Checkpkg==
+
+Checks to implement:
+ - foo_bar != CSWfoo-bar -> error
+ - *dev(el)? -> error, suggest *-devel
+ - *-?rt -> error, suggest specific library packages
+ - empty package without 'transitional' in the name --> error, suggest
+ 	 'transitional'
+ - CSWpmfoo --> error, suggest CSWpm-foo
+ - Dependency on a transitional package --> error
+ 	 ('transitional', 'stub', 'legacy')
+ - Dependency on CSWcas-initsmf + rc* files --> error
+
+
 Development plan for checkpkg:
-
-- Move the 'data' field of the srv4_file table to a separate table (should
-	speed up checking if stats are already collected)
-- Store run history and display stats from each run
+- Generalize dependency checking by adding NeedFile(file_list, reason) to
+  error_mgr.  It's possible to need one of the listed files only, so files are
+  given as alternatives, but the reason is common.
+- Display stats from each run
 - Shorten the on-screen output, add commands to display override lines
 - Move the set check stats outside of checking functions, remove the special
-	status of dependency checking functions; add a progress bar for it.
+  status of dependency checking functions; add a progress bar for it.
 - Restructure the error reporting, group them by errors.
+- Sort all list data structures so that it's possible to diff the results of
+  pprint.pprint() and see meaningful results.  This will be the new
+  implementation for comparepkg.
+- Add fields to the srv4_file_stats table:
+  - source URL (for grouping by software)
+  - Description (to search for the word 'transitional')
+- Don't suggest two packages for the same soname.
 
+Also, see ticket list on trac: http://sourceforge.net/apps/trac/gar/report/1
 
+Items done:
+- Move the 'data' field of the srv4_file table to a separate table (should
+  speed up checking if stats are already collected)
+- Store run history
+
 Known problems:
 - libmagic fails sometimes when processing the whole catalog
+- hachoir_parser fails sometimes on i386 packages when examining them on sparc
 
+Package dependencies:
 
-Dependencies:
-
 It's possible to develop checkpkg on a non-Solaris platform, using unit
 tests as means to run various bits of code.  Here's the dependency list
 for Ubuntu.
 
-  python-cheetah
-  python-hachoir-parser
-  python-magic
-  python-mox
-  python-progressbar
+sudo aptitude install \
+  python-cheetah \
+  python-hachoir-parser \
+  python-magic \
+  python-mox \
+  python-progressbar \
+  python-sqlobject \
   python-yaml
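
A note on the NeedFile(file_list, reason) item in the development plan above:
it describes a planned error-manager call, not code present in this revision.
The snippet below is a minimal, hypothetical sketch of what such an interface
could look like; the class name, attribute names and sample paths are made up,
and only the call signature comes from the plan.

import collections

# The listed files are alternatives; the reason they are needed is common.
NeededFile = collections.namedtuple("NeededFile", ["file_list", "reason"])


class DependencyErrorManager(object):
  """Hypothetical error manager growing a NeedFile() method."""

  def __init__(self):
    self.needed_files = []

  def NeedFile(self, file_list, reason):
    # Any single file from file_list would satisfy the requirement.
    self.needed_files.append(NeededFile(tuple(file_list), reason))


em = DependencyErrorManager()
em.NeedFile(
    ["/opt/csw/lib/libfoo.so.1", "/opt/csw/lib/sparcv9/libfoo.so.1"],
    "libfoo.so.1 is needed by a binary in the package under test")
for needed in em.needed_files:
  print needed.reason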

Modified: csw/mgar/gar/v2-fortran/lib/python/catalog.py
===================================================================
--- csw/mgar/gar/v2-fortran/lib/python/catalog.py	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/lib/python/catalog.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -123,6 +123,7 @@
   def _GetCatalogData(self, fd):
     catalog_data = []
     for line in fd:
+      if line.startswith("#"): continue
       try:
         parsed = self._ParseCatalogLine(line)
         catalog_data.append(parsed)

Modified: csw/mgar/gar/v2-fortran/lib/python/checkpkg.py
===================================================================
--- csw/mgar/gar/v2-fortran/lib/python/checkpkg.py	2011-01-12 19:01:53 UTC (rev 12514)
+++ csw/mgar/gar/v2-fortran/lib/python/checkpkg.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -3,62 +3,33 @@
 # This is the checkpkg library, common for all checkpkg tests written in
 # Python.
 
-import copy
-import cPickle
-import errno
 import itertools
 import logging
-import operator
 import optparse
-import os
 import os.path
 import re
 import pprint
 import progressbar
-import socket
-import sqlite3
 import sqlobject
-import time
-from sqlobject import sqlbuilder
 import subprocess
-import textwrap
-from Cheetah import Template
 import database
 
-import package
 import inspective_package
-import package_checks
-import package_stats
 import models as m
-import configuration as c
-import tag
+import common_constants
+import package_stats
 
 
-DEBUG_BREAK_PKGMAP_AFTER = False
-SYSTEM_PKGMAP = "/var/sadm/install/contents"
-NEEDED_SONAMES = "needed sonames"
-RUNPATH = "runpath"
-SONAME = "soname"
-CONFIG_MTIME = "mtime"
-CONFIG_DB_SCHEMA = "db_schema_version"
-DO_NOT_REPORT_SURPLUS = set([u"CSWcommon", u"CSWcswclassutils", u"CSWisaexec"])
-DO_NOT_REPORT_MISSING = set([])
-DO_NOT_REPORT_MISSING_RE = [r"\*?SUNW.*"]
-DUMP_BIN = "/usr/ccs/bin/dump"
-PSTAMP_RE = r"(?P<username>\w+)@(?P<hostname>[\w\.-]+)-(?P<timestamp>\d+)"
 DESCRIPTION_RE = r"^([\S]+) - (.*)$"
 BAD_CONTENT_REGEXES = (
-    # Slightly obfuscating these by using the default concatenation of
-    # strings.
+    # Slightly obfuscating these by using concatenation of strings.
+    r'/export' r'/home',
     r'/export' r'/medusa',
     r'/opt' r'/build',
+    r'/usr' r'/local',
+    r'/usr' r'/share',
 )
 
-SYSTEM_SYMLINKS = (
-    ("/opt/csw/bdb4",     ["/opt/csw/bdb42"]),
-    ("/64",               ["/amd64", "/sparcv9"]),
-    ("/opt/csw/lib/i386", ["/opt/csw/lib"]),
-)
 INSTALL_CONTENTS_AVG_LINE_LENGTH = 102.09710677919261
 SYS_DEFAULT_RUNPATH = [
     "/usr/lib/$ISALIST",
@@ -67,80 +38,8 @@
     "/lib",
 ]
 
-CONTENT_PKG_RE = r"^\*?(CSW|SUNW)[0-9a-zA-Z\-]?[0-9a-z\-]+$"
-MD5_RE = r"^[0123456789abcdef]{32}$"
+MD5_RE = re.compile(r"^[0123456789abcdef]{32}$")
 
-REPORT_TMPL = u"""#if $missing_deps or $surplus_deps or $orphan_sonames
-Dependency issues of $pkgname:
-#end if
-#if $missing_deps
-#for $pkg, $reasons in $sorted($missing_deps)
-$pkg, reasons:
-#for $reason in $reasons
- - $reason
-#end for
-RUNTIME_DEP_PKGS_$pkgname += $pkg
-#end for
-#end if
-#if $surplus_deps
-If you don't know of any reasons to include these dependencies, you might remove them:
-#for $pkg in $sorted($surplus_deps)
-? $pkg
-#end for
-#end if
-"""
-
-SCREEN_ERROR_REPORT_TMPL = u"""#if $errors
-#if $debug
-ERROR: One or more errors have been found by $name.
-#end if
-#for $pkgname in $errors
-$pkgname:
-#for $error in $errors[$pkgname]
-#if $debug
-  $repr($error)
-#elif $error.msg
-$textwrap.fill($error.msg, 78, initial_indent="# ", subsequent_indent="# ")
-# -> $repr($error)
-
-#end if
-#end for
-#end for
-#else
-#if $debug
-OK: $repr($name) module found no problems.
-#end if
-#end if
-#if $messages
-#for $msg in $messages
-$textwrap.fill($msg, 78, initial_indent=" * ", subsequent_indent="   ")
-#end for
-#end if
-#if $gar_lines
-
-# Checkpkg suggests adding the following lines to the GAR recipe:
-# This is a summary; see above for details.
-#for $line in $gar_lines
-$line
-#end for
-#end if
-"""
-
-# http://www.cheetahtemplate.org/docs/users_guide_html_multipage/language.directives.closures.html
-TAG_REPORT_TMPL = u"""#if $errors
-# Tags reported by $name module
-#for $pkgname in $errors
-#for $tag in $errors[$pkgname]
-#if $tag.msg
-$textwrap.fill($tag.msg, 70, initial_indent="# ", subsequent_indent="# ")
-#end if
-$pkgname: ${tag.tag_name}#if $tag.tag_info# $tag.tag_info#end if#
-#end for
-#end for
-#end if
-"""
-
-
 class Error(Exception):
   pass
 
@@ -157,11 +56,12 @@
   pass
 
 
+class SetupError(Error):
+  pass
+
+
 def GetOptions():
   parser = optparse.OptionParser()
-  parser.add_option("-b", "--stats-basedir", dest="stats_basedir",
-                    help=("The base directory with package statistics "
-                          "in yaml format, e.g. ~/.checkpkg/stats"))
   parser.add_option("-d", "--debug", dest="debug",
                     default=False, action="store_true",
                     help="Turn on debugging messages")
@@ -172,8 +72,6 @@
                     default=False, action="store_true",
                     help=("Print less messages"))
   (options, args) = parser.parse_args()
-  if not options.stats_basedir:
-    raise ConfigurationError("ERROR: the -b option is missing.")
   # Using set() to make the arguments unique.
   return options, set(args)
 
@@ -191,863 +89,13 @@
 
 
 def ExtractBuildUsername(pkginfo):
-  m = re.match(PSTAMP_RE, pkginfo["PSTAMP"])
+  m = re.match(common_constants.PSTAMP_RE, pkginfo["PSTAMP"])
   return m.group("username") if m else None
 
 
-class SystemPkgmap(database.DatabaseClient):
-  """A class to hold and manipulate the /var/sadm/install/contents file."""
-
-  STOP_PKGS = ["SUNWbcp", "SUNWowbcp", "SUNWucb"]
-
-  def __init__(self, system_pkgmap_files=None, debug=False):
-    """There is no need to re-parse it each time.
-
-    Read it slowly the first time and cache it for later."""
-    super(SystemPkgmap, self).__init__(debug=debug)
-    self.cache = {}
-    self.pkgs_by_path_cache = {}
-    self.file_mtime = None
-    self.cache_mtime = None
-    self.initialized = False
-    if not system_pkgmap_files:
-      self.system_pkgmap_files = [SYSTEM_PKGMAP]
-    else:
-      self.system_pkgmap_files = system_pkgmap_files
-    self.csw_pkg_re = re.compile(CONTENT_PKG_RE)
-    self.digits_re = re.compile(r"^[0-9]+$")
-
-  def _LazyInitializeDatabase(self):
-    if not self.initialized:
-      self.InitializeDatabase()
-
-  def InitializeRawDb(self):
-    """It's necessary for low level operations."""
-    if True:
-      logging.debug("Connecting to sqlite")
-      self.sqlite_conn = sqlite3.connect(self.GetDatabasePath())
-
-  def InitializeDatabase(self):
-    """Established the connection to the database.
-
-    TODO: Refactor this class to first create CswFile with no primary key and
-          no indexes.
-    """
-    need_to_create_tables = False
-    db_path = self.GetDatabasePath()
-    checkpkg_dir = os.path.join(os.environ["HOME"], self.CHECKPKG_DIR)
-    if not os.path.exists(db_path):
-      logging.info("Building the  cache database %s.", self.system_pkgmap_files)
-      logging.info("The cache will be kept in %s.", db_path)
-      if not os.path.exists(checkpkg_dir):
-        logging.debug("Creating %s", checkpkg_dir)
-        os.mkdir(checkpkg_dir)
-      need_to_create_tables = True
-    self.InitializeRawDb()
-    self.InitializeSqlobject()
-    if not self.IsDatabaseGoodSchema():
-      logging.info("Old database schema detected.")
-      self.PurgeDatabase(drop_tables=True)
-      need_to_create_tables = True
-    if need_to_create_tables:
-      self.CreateTables()
-      self.PerformInitialDataImport()
-    if not self.IsDatabaseUpToDate():
-      logging.debug("Rebuilding the package cache, can take a few minutes.")
-      self.ClearTablesForUpdates()
-      self.RefreshDatabase()
-    self.initialized = True
-
-  def RefreshDatabase(self):
-    for pkgmap_path in self.system_pkgmap_files:
-      self._ProcessSystemPkgmap(pkgmap_path)
-    self.PopulatePackagesTable()
-    self.SetDatabaseMtime()
-
-  def PerformInitialDataImport(self):
-    """Imports data into the database.
-
-    Original bit of code from checkpkg:
-    egrep -v 'SUNWbcp|SUNWowbcp|SUNWucb' /var/sadm/install/contents |
-        fgrep -f $EXTRACTDIR/liblist >$EXTRACTDIR/shortcatalog
-    """
-    for pkgmap_path in self.system_pkgmap_files:
-      self._ProcessSystemPkgmap(pkgmap_path)
-    self.SetDatabaseSchemaVersion()
-    self.PopulatePackagesTable()
-    self.SetDatabaseMtime()
-
-  def _ProcessSystemPkgmap(self, pkgmap_path):
-    """Update the database using data from pkgmap.
-
-    The strategy to only update the necessary bits:
-      - for each new row
-        - look it up in the db
-          - if doesn't exist, create it
-          - if exists, check the
-          TODO: continue this description
-    """
-    INSERT_SQL = """
-    INSERT INTO csw_file (basename, path, line)
-    VALUES (?, ?, ?);
-    """
-    sqlite_cursor = self.sqlite_conn.cursor()
-    break_after = DEBUG_BREAK_PKGMAP_AFTER
-    contents_length = os.stat(pkgmap_path).st_size
-    if break_after:
-      estimated_lines = break_after
-    else:
-      estimated_lines = contents_length / INSTALL_CONTENTS_AVG_LINE_LENGTH
-    # The progressbar library doesn't like handling larger numbers
-    # It displays up to 99% if we feed it a maxval in the range of hundreds of
-    # thousands.
-    progressbar_divisor = int(estimated_lines / 1000)
-    if progressbar_divisor < 1:
-      progressbar_divisor = 1
-    update_period = 1L
-    # To help delete old records
-    system_pkgmap_fd = open(pkgmap_path, "r")
-    stop_re = re.compile("(%s)" % "|".join(self.STOP_PKGS))
-    # Creating a data structure:
-    # soname - {<path1>: <line1>, <path2>: <line2>, ...}
-    logging.debug("Building database cache db of the %s file",
-                  pkgmap_path)
-    logging.info("Processing %s, it can take a few minutes", pkgmap_path)
-    count = itertools.count()
-    bar = progressbar.ProgressBar()
-    bar.maxval = estimated_lines / progressbar_divisor
-    bar.start()
-    # I tried dropping the csw_file_basename_idx index to speed up operation,
-    # but after I measured the times, it turned out that it doesn't make any
-    # difference to the total runnng time.
-    # logging.info("Dropping csw_file_basename_idx")
-    # sqlite_cursor.execute("DROP INDEX csw_file_basename_idx;")
-    for line in system_pkgmap_fd:
-      i = count.next()
-      if not i % update_period and (i / progressbar_divisor) <= bar.maxval:
-        bar.update(i / progressbar_divisor)
-      if stop_re.search(line):
-        continue
-      if line.startswith("#"):
-        continue
-      fields = re.split(c.WS_RE, line)
-      pkgmap_entry_path = fields[0].split("=")[0]
-      pkgmap_entry_dir, pkgmap_entry_base_name = os.path.split(pkgmap_entry_path)
-      # The following SQLObject-driven inserts are 60 times slower than the raw
-      # sqlite API.
-      # pkgmap_entry = m.CswFile(basename=pkgmap_entry_base_name,
-      #                          path=pkgmap_entry_dir, line=line.strip())
-      # This page has some hints:
-      # http://www.mail-archive.com/sqlobject-discuss@lists.sourceforge.net/msg04641.html
-      # "These are simple straightforward INSERTs without any additional
-      # high-level burden - no SELECT, no caching, nothing. Fire and forget."
-      # sql = self.sqo_conn.sqlrepr(
-      #   sqlobject.sqlbuilder.Insert(m.CswFile.sqlmeta.table, values=record))
-      # self.sqo_conn.query(sql)
-      # ...unfortunately, it isn't any faster in practice.
-      # The fastest way is:
-      sqlite_cursor.execute(INSERT_SQL, [pkgmap_entry_base_name,
-                                         pkgmap_entry_dir,
-                                         line.strip()])
-      if break_after and i > break_after:
-        logging.warning("Breaking after %s for debugging purposes.", break_after)
-        break
-    bar.finish()
-    self.sqlite_conn.commit()
-    logging.debug("All lines of %s were processed.", pkgmap_path)
-
-  def _ParsePkginfoLine(self, line):
-    fields = re.split(c.WS_RE, line)
-    pkgname = fields[1]
-    pkg_desc = u" ".join(fields[2:])
-    return pkgname, pkg_desc
-
-  def PopulatePackagesTable(self):
-    logging.info("Updating the packages table")
-    args = ["pkginfo"]
-    pkginfo_proc = subprocess.Popen(args, stdout=subprocess.PIPE)
-    stdout, stderr = pkginfo_proc.communicate()
-    ret = pkginfo_proc.wait()
-    lines = stdout.splitlines()
-    bar = progressbar.ProgressBar()
-    bar.maxval = len(lines)
-    bar.start()
-    count = itertools.count()
-    INSERT_SQL = """
-    INSERT INTO pkginst (pkgname, pkg_desc)
-    VALUES (?, ?);
-    """
-    # If self.GetInstalledPackages calls out to the initialization,
-    # the result is an infinite recursion.
-    installed_pkgs = self.GetInstalledPackages(initialize=False)
-    for line in stdout.splitlines():
-      pkgname, pkg_desc = self._ParsePkginfoLine(line)
-      if pkgname not in installed_pkgs:
-        # This is slow:
-        # pkg = m.Pkginst(pkgname=pkgname, pkg_desc=pkg_desc)
-        # This is much faster:
-        self.sqlite_conn.execute(INSERT_SQL, [pkgname, pkg_desc])
-      i = count.next()
-      bar.update(i)
-    # Need to commit, otherwise subsequent SQLObject calls will fail.
-    self.sqlite_conn.commit()
-    bar.finish()
-
-  def SetDatabaseMtime(self):
-    mtime = self.GetFileMtime()
-    res = m.CswConfig.select(m.CswConfig.q.option_key==CONFIG_MTIME)
-    if res.count() == 0:
-      logging.debug("Inserting the mtime (%s) into the database.", mtime)
-      config_record = m.CswConfig(option_key=CONFIG_MTIME, float_value=mtime)
-    else:
-      logging.debug("Updating the mtime (%s) in the database.", mtime)
-      res.getOne().float_value = mtime
-
-  def SetDatabaseSchemaVersion(self):
-    try:
-      config_option = m.CswConfig.select(
-          m.CswConfig.q.option_key==CONFIG_DB_SCHEMA).getOne()
-      config_option.int_value = database.DB_SCHEMA_VERSION
-    except sqlobject.main.SQLObjectNotFound, e:
-      version = m.CswConfig(option_key=CONFIG_DB_SCHEMA,
-                            int_value=database.DB_SCHEMA_VERSION)
-
-  def GetPkgmapLineByBasename(self, filename):
-    """Returns pkgmap lines by basename:
-      {
-        path1: line1,
-        path2: line2,
-      }
-    """
-    if filename in self.cache:
-      return self.cache[filename]
-    self._LazyInitializeDatabase()
-    res = m.CswFile.select(m.CswFile.q.basename==filename)
-    lines = {}
-    for obj in res:
-      lines[obj.path] = obj.line
-    if len(lines) == 0:
-      logging.debug("Cache doesn't contain filename %s", filename)
-    self.cache[filename] = lines
-    return lines
-
-  def _InferPackagesFromPkgmapLine(self, line):
-    """Given a pkgmap line, return all packages it contains."""
-    line = line.strip()
-    parts = re.split(c.WS_RE, line)
-    pkgs = []
-    if parts[1] == 'd':
-      parts = parts[6:]
-    while parts:
-      part = parts.pop()
-      if self.digits_re.match(part):
-        break
-      elif "none" == part:
-        break
-      pkgs.append(part)
-    # Make the packages appear in the same order as in the install/contents
-    # file.
-    pkgs.reverse()
-    return pkgs
-
-  def GetPathsAndPkgnamesByBasename(self, filename):
-    """Returns paths and packages by basename.
-
-    e.g.
-    {"/opt/csw/lib": ["CSWfoo", "CSWbar"],
-     "/opt/csw/1/lib": ["CSWfoomore"]}
-    """
-    lines = self.GetPkgmapLineByBasename(filename)
-    pkgs = {}
-    # Infer packages
-    for file_path in lines:
-      pkgs[file_path] = self._InferPackagesFromPkgmapLine(lines[file_path])
-    # self.error_mgr_mock.GetPathsAndPkgnamesByBasename('libc.so.1').AndReturn({
-    #       "/usr/lib": (u"SUNWcsl",)})
-    logging.debug("self.error_mgr_mock.GetPathsAndPkgnamesByBasename(%s).AndReturn(%s)",
-                  repr(filename), pprint.pformat(pkgs))
-    return pkgs
-
-  def GetPkgByPath(self, full_path):
-    if full_path not in self.pkgs_by_path_cache:
-      self._LazyInitializeDatabase()
-      path, basename = os.path.split(full_path)
-      try:
-        obj = m.CswFile.select(
-            sqlobject.AND(
-              m.CswFile.q.path==path,
-              m.CswFile.q.basename==basename)).getOne()
-        self.pkgs_by_path_cache[full_path] = self._InferPackagesFromPkgmapLine(
-            obj.line)
-      except sqlobject.main.SQLObjectNotFound, e:
-        logging.debug("Couldn't find in the db: %s/%s", path, basename)
-        logging.debug(e)
-        self.pkgs_by_path_cache[full_path] = []
-    logging.debug("self.error_mgr_mock.GetPkgByPath(%s).AndReturn(%s)",
-                  repr(full_path), pprint.pformat(self.pkgs_by_path_cache[full_path]))
-    return self.pkgs_by_path_cache[full_path]
-
-  def GetDatabaseMtime(self):
-    if not self.cache_mtime:
-      res = m.CswConfig.select(m.CswConfig.q.option_key==CONFIG_MTIME)
-      if res.count() == 1:
-        self.cache_mtime = res.getOne().float_value
-      elif res.count() < 1:
-        self.cache_mtime = 1
-    logging.debug("GetDatabaseMtime() --> %s", self.cache_mtime)
-    return self.cache_mtime
-
-  def GetFileMtime(self):
-    if not self.file_mtime:
-      stat_data = os.stat(SYSTEM_PKGMAP)
-      self.file_mtime = stat_data.st_mtime
-    return self.file_mtime
-
-  def GetDatabaseSchemaVersion(self):
-    schema_on_disk = 1L
-    if not m.CswConfig.tableExists():
-      return schema_on_disk;
-    res = m.CswConfig.select(m.CswConfig.q.option_key == CONFIG_DB_SCHEMA)
-    if res.count() < 1:
-      logging.debug("No db schema value found, assuming %s.",
-                   schema_on_disk)
-    elif res.count() == 1:
-      schema_on_disk = res.getOne().int_value
-    return schema_on_disk
-
-  def IsDatabaseUpToDate(self):
-    f_mtime_epoch = self.GetFileMtime()
-    d_mtime_epoch = self.GetDatabaseMtime()
-    f_mtime = time.gmtime(int(f_mtime_epoch))
-    d_mtime = time.gmtime(int(d_mtime_epoch))
-    logging.debug("IsDatabaseUpToDate: f_mtime %s, d_time: %s", f_mtime, d_mtime)
-    # Rounding up to integer seconds.  There is a race condition: 
-    # pkgadd finishes at 100.1
-    # checkpkg reads /var/sadm/install/contents at 100.2
-    # new pkgadd runs and finishes at 100.3
-    # subsequent checkpkg runs won't pick up the last change.
-    # I don't expect pkgadd to run under 1s.
-    fresh = f_mtime <= d_mtime
-    good_version = self.GetDatabaseSchemaVersion() >= database.DB_SCHEMA_VERSION
-    logging.debug("IsDatabaseUpToDate: good_version=%s, fresh=%s",
-                  repr(good_version), repr(fresh))
-    return fresh and good_version
-
-  def ClearTablesForUpdates(self):
-    for table in self.TABLES_THAT_NEED_UPDATES:
-      table.clearTable()
-
-  def PurgeDatabase(self, drop_tables=False):
-    if drop_tables:
-      for table in self.TABLES:
-        if table.tableExists():
-          table.dropTable()
-    else:
-      logging.debug("Truncating all tables")
-      for table in self.TABLES:
-        table.clearTable()
-
-  def GetInstalledPackages(self, initialize=True):
-    """Returns a dictionary of all installed packages."""
-    if initialize:
-      self._LazyInitializeDatabase()
-    res = m.Pkginst.select()
-    return dict([[str(x.pkgname), str(x.pkg_desc)] for x in res])
-
-
-class LddEmulator(object):
-  """A class to emulate ldd(1)
-
-  Used primarily to resolve SONAMEs and detect package dependencies.
-  """
-  def __init__(self):
-    self.runpath_expand_cache = {}
-    self.runpath_origin_expand_cache = {}
-    self.symlink_expand_cache = {}
-    self.symlink64_cache = {}
-    self.runpath_sanitize_cache = {}
-
-  def ExpandRunpath(self, runpath, isalist, binary_path):
-    """Expands a signle runpath element.
-
-    Args:
-      runpath: e.g. "/opt/csw/lib/$ISALIST"
-      isalist: isalist elements
-      binary_path: Necessary to expand $ORIGIN
-    """
-    key = (runpath, tuple(isalist))
-    if key not in self.runpath_expand_cache:
-      origin_present = False
-      # Emulating $ISALIST and $ORIGIN expansion
-      if '$ORIGIN' in runpath:
-        origin_present = True
-      if origin_present:
-        key_o = (runpath, tuple(isalist), binary_path)
-        if key_o in self.runpath_origin_expand_cache:
-          return self.runpath_origin_expand_cache[key_o]
-        else:
-          if not binary_path.startswith("/"):
-            binary_path = "/" + binary_path
-          runpath = runpath.replace('$ORIGIN', binary_path)
-      if '$ISALIST' in runpath:
-        expanded_list  = [runpath.replace('/$ISALIST', '')]
-        expanded_list += [runpath.replace('$ISALIST', isa) for isa in isalist]
-      else:
-        expanded_list = [runpath]
-      expanded_list = [os.path.abspath(p) for p in expanded_list]
-      if not origin_present:
-        self.runpath_expand_cache[key] = expanded_list
-      else:
-        self.runpath_origin_expand_cache[key_o] = expanded_list
-        return self.runpath_origin_expand_cache[key_o]
-    return self.runpath_expand_cache[key]
-
-  def ExpandSymlink(self, symlink, target, input_path):
-    key = (symlink, target, input_path)
-    if key not in self.symlink_expand_cache:
-      symlink_re = re.compile(r"%s(/|$)" % symlink)
-      if re.search(symlink_re, input_path):
-        result = input_path.replace(symlink, target)
-      else:
-        result = input_path
-      self.symlink_expand_cache[key] = result
-    return self.symlink_expand_cache[key]
-
-  def Emulate64BitSymlinks(self, runpath_list):
-    """Need to emulate the 64 -> amd64, 64 -> sparcv9 symlink
-
-    Since we don't know the architecture, we are adding both amd64 and
-    sparcv9.  It should be safe - there are other checks that make sure
-    that right architectures are in the right directories.
-    """
-    key = tuple(runpath_list)
-    if key not in self.symlink64_cache:
-      symlinked_list = []
-      for runpath in runpath_list:
-        for symlink, expansion_list in SYSTEM_SYMLINKS:
-          for target in expansion_list:
-            expanded = self.ExpandSymlink(symlink, target, runpath)
-            if expanded not in symlinked_list:
-              symlinked_list.append(expanded)
-      self.symlink64_cache[key] = symlinked_list
-    return self.symlink64_cache[key]
-
-  def SanitizeRunpath(self, runpath):
-    if runpath not in self.runpath_sanitize_cache:
-      self.runpath_sanitize_cache[runpath] = os.path.normpath(runpath)
-    return self.runpath_sanitize_cache[runpath]
-
-
-  def ResolveSoname(self, runpath_list, soname, isalist,
-                    path_list, binary_path):
-    """Emulates ldd behavior, minimal implementation.
-
-    runpath: e.g. ["/opt/csw/lib/$ISALIST", "/usr/lib"]
-    soname: e.g. "libfoo.so.1"
-    isalist: e.g. ["sparcv9", "sparcv8"]
-    path_list: A list of paths where the soname is present, e.g.
-               ["/opt/csw/lib", "/opt/csw/lib/sparcv9"]
-
-    The function returns the one path.
-    """
-    # Emulating the install time symlinks, for instance, if the prototype contains
-    # /opt/csw/lib/i386/foo.so.0 and /opt/csw/lib/i386 is a symlink to ".",
-    # the shared library ends up in /opt/csw/lib/foo.so.0 and should be
-    # findable even when RPATH does not contain $ISALIST.
-    original_paths_by_expanded_paths = {}
-    for p in path_list:
-      expanded_p_list = self.Emulate64BitSymlinks([p])
-      # We can't just expand and return; we need to return one of the paths given
-      # in the path_list.
-      for expanded_p in expanded_p_list:
-        original_paths_by_expanded_paths[expanded_p] = p
-    logging.debug(
-        "%s: looking for %s in %s",
-        soname, runpath_list, original_paths_by_expanded_paths.keys())
-    for runpath_expanded in runpath_list:
-      if runpath_expanded in original_paths_by_expanded_paths:
-        # logging.debug("Found %s",
-        #               original_paths_by_expanded_paths[runpath_expanded])
-        return original_paths_by_expanded_paths[runpath_expanded]
-
-
-def ParseDumpOutput(dump_output):
-  binary_data = {RUNPATH: [],
-                 NEEDED_SONAMES: []}
-  runpath = []
-  rpath = []
-  for line in dump_output.splitlines():
-    fields = re.split(c.WS_RE, line)
-    if len(fields) < 3:
-      continue
-    if fields[1] == "NEEDED":
-      binary_data[NEEDED_SONAMES].append(fields[2])
-    elif fields[1] == "RUNPATH":
-      runpath.extend(fields[2].split(":"))
-    elif fields[1] == "RPATH":
-      rpath.extend(fields[2].split(":"))
-    elif fields[1] == "SONAME":
-      binary_data[SONAME] = fields[2]
-  if runpath:
-    binary_data[RUNPATH].extend(runpath)
-  elif rpath:
-    binary_data[RUNPATH].extend(rpath)
-
-  # Converting runpath to a tuple, which is a hashable data type and can act as
-  # a key in a dict.
-  binary_data[RUNPATH] = tuple(binary_data[RUNPATH])
-  # the NEEDED list must not be modified, converting to a tuple.
-  binary_data[NEEDED_SONAMES] = tuple(binary_data[NEEDED_SONAMES])
-  binary_data["RUNPATH RPATH the same"] = (runpath == rpath)
-  binary_data["RPATH set"] = bool(rpath)
-  binary_data["RUNPATH set"] = bool(runpath)
-  return binary_data
-
-
-class CheckpkgManagerBase(object):
-  """Common functions between the older and newer calling functions."""
-
-  def __init__(self, name, stats_basedir, md5sum_list, debug=False):
-    self.debug = debug
-    self.name = name
-    self.md5sum_list = md5sum_list
-    self.stats_basedir = stats_basedir
-    self.errors = []
-    self.individual_checks = []
-    self.set_checks = []
-    self.packages = []
-
-  def GetPackageStatsList(self):
-    return [package_stats.PackageStats(None, self.stats_basedir, x)
-            for x in self.md5sum_list]
-
-  def FormatReports(self, errors, messages, gar_lines):
-    namespace = {
-        "name": self.name,
-        "errors": errors,
-        "debug": self.debug,
-        "textwrap": textwrap,
-        "messages": messages,
-        "gar_lines": gar_lines,
-    }
-    screen_t = Template.Template(SCREEN_ERROR_REPORT_TMPL, searchList=[namespace])
-    tags_report_t = Template.Template(TAG_REPORT_TMPL, searchList=[namespace])
-    return screen_t, tags_report_t
-
-  def SetErrorsToDict(self, set_errors, a_dict):
-    # These were generated by a set, but are likely to be bound to specific
-    # packages. We'll try to preserve the package assignments.
-    errors = copy.copy(a_dict)
-    for tag in set_errors:
-      if tag.pkgname:
-        if not tag.pkgname in errors:
-          errors[tag.pkgname] = []
-        errors[tag.pkgname].append(tag)
-      else:
-        if "package-set" not in errors:
-          errors["package-set"] = []
-        errors["package-set"].append(tag)
-    return errors
-
-  def GetOptimizedAllStats(self, stats_obj_list):
-    logging.info("Unwrapping candies...")
-    pkgs_data = []
-    counter = itertools.count()
-    length = len(stats_obj_list)
-    bar = progressbar.ProgressBar()
-    bar.maxval = length
-    bar.start()
-    for stats_obj in stats_obj_list:
-      # pkg_data = {}
-      # This bit is tightly tied to the data structures returned by
-      # PackageStats.
-      #
-      # Python strings are already implementing the flyweight pattern. What's
-      # left is lists and dictionaries.
-      i = counter.next()
-      # logging.debug("Loading stats for %s (%s/%s)",
-      #               stats_obj.md5sum, i, length)
-      raw_pkg_data = stats_obj.GetAllStats()
-      pkg_data = raw_pkg_data
-      pkgs_data.append(pkg_data)
-      bar.update(i)
-    bar.finish()
-    return pkgs_data
-
-  def Run(self):
-    """Runs all the checks
-
-    Returns a tuple of an exit code and a report.
-    """
-    packages_data = self.GetPackageStatsList()
-    db_stat_objs_by_pkgname = {}
-    obj_id_list = []
-    for pkg in packages_data:
-      db_obj = pkg.GetDbObject()
-      db_stat_objs_by_pkgname[db_obj.pkginst.pkgname] = db_obj
-      obj_id_list.append(db_obj.id)
-    logging.debug("Deleting old %s errors from the database.",
-                  db_obj.pkginst.pkgname)
-    conn = sqlobject.sqlhub.processConnection
-    # It's the maximum number of ORs in a SQL statement.
-    # Slicing the long list up into s-sized segments.  1000 is too much.
-    obj_id_lists = SliceList(obj_id_list, 900)
-    for obj_id_list in obj_id_lists:
-      # WARNING: This is raw SQL, potentially breaking during a transition to
-      # another db.  It's here for efficiency.
-      sql = ("DELETE FROM checkpkg_error_tag WHERE %s;"
-             % " OR ".join("srv4_file_id = %s" % x for x in obj_id_list))
-      conn.query(sql)
-    # Need to construct the predicate by hand.  Otherwise:
-    # File "/opt/csw/lib/python/site-packages/sqlobject/sqlbuilder.py",
-    # line 829, in OR
-    # return SQLOp("OR", op1, OR(*ops))
-    # RuntimeError: maximum recursion depth exceeded while calling a Python object
-    #
-    # The following also tries to use recursion and fails.
-    # delete_predicate = sqlobject.OR(False)
-    # for pred in delete_predicate_list:
-    #   delete_predicate = sqlobject.OR(delete_predicate, pred)
-    # conn.query(
-    #     conn.sqlrepr(sqlbuilder.Delete(m.CheckpkgErrorTag.sqlmeta.table,
-    #       delete_predicate
-    #     )))
-      # res = m.CheckpkgErrorTag.select(m.CheckpkgErrorTag.q.srv4_file==db_obj)
-      # for obj in res:
-      #   obj.destroySelf()
-    errors, messages, gar_lines = self.GetAllTags(packages_data)
-    no_errors = len(errors) + 1
-    bar = progressbar.ProgressBar()
-    bar.maxval = no_errors
-    count = itertools.count(1)
-    logging.info("Stuffing the candies under the pillow...")
-    bar.start()
-    for pkgname, es in errors.iteritems():
-      logging.debug("Saving %s errors to the database.", pkgname)
-      for e in es:
-        db_error = m.CheckpkgErrorTag(srv4_file=db_stat_objs_by_pkgname[e.pkgname],
-                                      pkgname=e.pkgname,
-                                      tag_name=e.tag_name,
-                                      tag_info=e.tag_info,
-                                      msg=e.msg)
-      bar.update(count.next())
-    bar.finish()
-    flat_error_list = reduce(operator.add, errors.values(), [])
-    screen_report, tags_report = self.FormatReports(errors, messages, gar_lines)
-    exit_code = 0
-    return (exit_code, screen_report, tags_report)
-
-
-class CheckInterfaceBase(object):
-  """Proxies interaction with checking functions.
-
-  It wraps access to the /var/sadm/install/contents cache.
-  """
-
-  def __init__(self, system_pkgmap=None, lines_dict=None):
-    self.system_pkgmap = system_pkgmap
-    if not self.system_pkgmap:
-      self.system_pkgmap = SystemPkgmap()
-    self.common_paths = {}
-    if lines_dict:
-      self.lines_dict = lines_dict
-    else:
-      self.lines_dict = {}
-
-  def GetPathsAndPkgnamesByBasename(self, basename):
-    """Proxies calls to self.system_pkgmap."""
-    return self.system_pkgmap.GetPathsAndPkgnamesByBasename(basename)
-
-  def GetPkgByPath(self, path):
-    """Proxies calls to self.system_pkgmap."""
-    return self.system_pkgmap.GetPkgByPath(path)
-
-  def GetInstalledPackages(self, initialize=True):
-    return self.system_pkgmap.GetInstalledPackages(initialize)
-
-  def _GetPathsForArch(self, arch):
-    if not arch in self.lines_dict:
-      file_name = os.path.join(
-          os.path.dirname(__file__), "..", "..", "etc", "commondirs-%s" % arch)
-      logging.debug("opening %s", file_name)
-      f = open(file_name, "r")
-      self.lines_dict[arch] = f.read().splitlines()
-      f.close()
-    return self.lines_dict[arch]
-
-  def GetCommonPaths(self, arch):
-    """Returns a list of paths for architecture, from gar/etc/commondirs*."""
-    # TODO: If this was cached, it could save a significant amount of time.
-    if arch not in ('i386', 'sparc', 'all'):
-      logging.warn("Wrong arch: %s", repr(arch))
-      return []
-    if arch == 'all':
-      archs = ('i386', 'sparc')
-    else:
-      archs = [arch]
-    lines = []
-    for arch in archs:
-      lines.extend(self._GetPathsForArch(arch))
-    return lines
-
-
-class IndividualCheckInterface(CheckInterfaceBase):
-  """To be passed to the checking functions.
-
-  Wraps the creation of tag.CheckpkgTag objects.
-  """
-
-  def __init__(self, pkgname, system_pkgmap=None):
-    super(IndividualCheckInterface, self).__init__(system_pkgmap)
-    self.pkgname = pkgname
-    self.errors = []
-
-  def ReportError(self, tag_name, tag_info=None, msg=None):
-    logging.debug("self.error_mgr_mock.ReportError(%s, %s, %s)",
-                  repr(tag_name), repr(tag_info), repr(msg))
-    checkpkg_tag = tag.CheckpkgTag(self.pkgname, tag_name, tag_info, msg=msg)
-    self.errors.append(checkpkg_tag)
-
-
-class SetCheckInterface(CheckInterfaceBase):
-  """To be passed to set checking functions."""
-
-  def __init__(self, system_pkgmap=None):
-    super(SetCheckInterface, self).__init__(system_pkgmap)
-    self.errors = []
-
-  def ReportError(self, pkgname, tag_name, tag_info=None, msg=None):
-    logging.debug("self.error_mgr_mock.ReportError(%s, %s, %s, %s)",
-                  repr(pkgname),
-                  repr(tag_name), repr(tag_info), repr(msg))
-    checkpkg_tag = tag.CheckpkgTag(pkgname, tag_name, tag_info, msg=msg)
-    self.errors.append(checkpkg_tag)
-
-
-class CheckpkgMessenger(object):
-  """Class responsible for passing messages from checks to the user."""
-  def __init__(self):
-    self.messages = []
-    self.one_time_messages = {}
-    self.gar_lines = []
-
-  def Message(self, m):
-    logging.debug("self.messenger.Message(%s)", repr(m))
-    self.messages.append(m)
-
-  def OneTimeMessage(self, key, m):
-    logging.debug("self.messenger.OneTimeMessage(%s, %s)", repr(key), repr(m))
-    if key not in self.one_time_messages:
-      self.one_time_messages[key] = m
-
-  def SuggestGarLine(self, m):
-    logging.debug("self.messenger.SuggestGarLine(%s)", repr(m))
-    self.gar_lines.append(m)
-
-
-class CheckpkgManager2(CheckpkgManagerBase):
-  """The second incarnation of the checkpkg manager.
-
-  Implements the API to be used by checking functions.
-
-  Its purpose is to reduce the amount of boilerplate code and allow for easier
-  unit test writing.
-  """
-  def _RegisterIndividualCheck(self, function):
-    self.individual_checks.append(function)
-
-  def _RegisterSetCheck(self, function):
-    self.set_checks.append(function)
-
-  def _AutoregisterChecks(self):
-    """Autodetects all defined checks."""
-    logging.debug("CheckpkgManager2._AutoregisterChecks()")
-    checkpkg_module = package_checks
-    members = dir(checkpkg_module)
-    for member_name in members:
-      logging.debug("Examining module member: %s", repr(member_name))
-      member = getattr(checkpkg_module, member_name)
-      if callable(member):
-        if member_name.startswith("Check"):
-          logging.debug("Registering individual check %s", repr(member_name))
-          self._RegisterIndividualCheck(member)
-        elif member_name.startswith("SetCheck"):
-          logging.debug("Registering set check %s", repr(member_name))
-          self._RegisterSetCheck(member)
-
-  def GetAllTags(self, stats_obj_list):
-    errors = {}
-    pkgmap = SystemPkgmap()
-    logging.debug("Loading all package statistics.")
-    pkgs_data = self.GetOptimizedAllStats(stats_obj_list)
-    logging.debug("All package statistics loaded.")
-    messenger = CheckpkgMessenger()
-    # Individual checks
-    count = itertools.count()
-    bar = progressbar.ProgressBar()
-    bar.maxval = len(pkgs_data) * len(self.individual_checks)
-    logging.info("Tasting candies one by one...")
-    bar.start()
-    for pkg_data in pkgs_data:
-      pkgname = pkg_data["basic_stats"]["pkgname"]
-      check_interface = IndividualCheckInterface(pkgname, pkgmap)
-      for function in self.individual_checks:
-        logger = logging.getLogger("%s-%s" % (pkgname, function.__name__))
-        logger.debug("Calling %s", function.__name__)
-        function(pkg_data, check_interface, logger=logger, messenger=messenger)
-        if check_interface.errors:
-          errors[pkgname] = check_interface.errors
-        bar.update(count.next())
-    bar.finish()
-    # Set checks
-    logging.info("Tasting them all at once...")
-    for function in self.set_checks:
-      logger = logging.getLogger(function.__name__)
-      check_interface = SetCheckInterface(pkgmap)
-      logger.debug("Calling %s", function.__name__)
-      function(pkgs_data, check_interface, logger=logger, messenger=messenger)
-      if check_interface.errors:
-        errors = self.SetErrorsToDict(check_interface.errors, errors)
-    messages = messenger.messages + messenger.one_time_messages.values()
-    return errors, messages, messenger.gar_lines
-
-  def Run(self):
-    self._AutoregisterChecks()
-    return super(CheckpkgManager2, self).Run()
-
-
-def GetIsalist():
-  args = ["isalist"]
-  isalist_proc = subprocess.Popen(args, stdout=subprocess.PIPE)
-  stdout, stderr = isalist_proc.communicate()
-  ret = isalist_proc.wait()
-  if ret:
-    logging.error("Calling isalist has failed.")
-  isalist = re.split(r"\s+", stdout.strip())
-  return tuple(isalist)
-
-
-def ErrorTagsFromFile(file_name):
-  fd = open(file_name)
-  error_tags = []
-  for line in fd:
-    if line.startswith("#"):
-      continue
-    pkgname, tag_name, tag_info = tag.ParseTagLine(line)
-    error_tags.append(tag.CheckpkgTag(pkgname, tag_name, tag_info))
-  return error_tags
-
-
-def SliceList(l, size):
-  """Trasforms a list into a list of lists."""
-  idxes = xrange(0, len(l), size)
-  sliced = [l[i:i+size] for i in idxes]
-  return sliced
-
 def IsMd5(s):
   # For optimization, move the compilation elsewhere.
-  md5_re = re.compile(MD5_RE)
-  return md5_re.match(s)
+  return MD5_RE.match(s)
 
 def GetPackageStatsByFilenamesOrMd5s(args, debug=False):
   filenames = []
@@ -1059,15 +107,15 @@
       filenames.append(arg)
   srv4_pkgs = [inspective_package.InspectiveCswSrv4File(x) for x in filenames]
   pkgstat_objs = []
-  bar = progressbar.ProgressBar()
-  bar.maxval = len(md5s) + len(srv4_pkgs)
-  bar.start()
+  pbar = progressbar.ProgressBar()
+  pbar.maxval = len(md5s) + len(srv4_pkgs)
+  pbar.start()
   counter = itertools.count()
   for pkg in srv4_pkgs:
     pkgstat_objs.append(package_stats.PackageStats(pkg, debug=debug))
-    bar.update(counter.next())
+    pbar.update(counter.next())
   for md5 in md5s:
     pkgstat_objs.append(package_stats.PackageStats(None, md5sum=md5, debug=debug))
-    bar.update(counter.next())
-  bar.finish()
+    pbar.update(counter.next())
+  pbar.finish()
   return pkgstat_objs
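
MD5_RE is now compiled once at module level (see the hunk above), and both
IsMd5() and checkpkg2.py match candidate arguments against it directly.  Below
is a small, self-contained sketch of the file-name-versus-md5-sum split; the
regular expression is the one from above, while the function name and the
sample values are made up for illustration.

import re

# The same pattern checkpkg.py now compiles once at module level.
MD5_RE = re.compile(r"^[0123456789abcdef]{32}$")


def SplitFilesAndMd5s(args):
  """Separates md5 sums from package file names (illustration only)."""
  md5_sums, file_names = [], []
  for arg in args:
    if MD5_RE.match(arg):
      md5_sums.append(arg)
    else:
      file_names.append(arg)
  return md5_sums, file_names


sample_args = [
    "d41d8cd98f00b204e9800998ecf8427e",                  # an md5 sum
    "foo-1.0,REV=2011.01.12-SunOS5.9-sparc-CSW.pkg.gz",  # a package file name
]
print SplitFilesAndMd5s(sample_args)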

Copied: csw/mgar/gar/v2-fortran/lib/python/checkpkg2.py (from rev 12514, csw/mgar/gar/v2/lib/python/checkpkg2.py)
===================================================================
--- csw/mgar/gar/v2-fortran/lib/python/checkpkg2.py	                        (rev 0)
+++ csw/mgar/gar/v2-fortran/lib/python/checkpkg2.py	2011-01-12 19:26:11 UTC (rev 12515)
@@ -0,0 +1,179 @@
+#!/usr/bin/env python2.6
+#
+# checkpkg
+#
+
+import logging
+import operator
+import optparse
+import os
+import sys
+import textwrap
+import configuration
+import datetime
+import database
+
+import package_stats
+import checkpkg
+import checkpkg_lib
+import overrides
+import models
+import sqlobject
+
+USAGE = """%prog [ options ] pkg1 [ pkg2 [ ... ] ]"""
+CHECKPKG_MODULE_NAME = "The main checking module."
+BEFORE_OVERRIDES = """If any of the reported errors were false positives, you
+can override them by pasting the lines below into the GAR recipe."""
+
+AFTER_OVERRIDES = """Please note that checkpkg isn't suggesting you should
+simply add these overrides to the Makefile.  It only shows what the overrides
+could look like.  You need to understand what the reported issues are about and
+use your best judgement to decide whether to fix the underlying problems or
+override them. For more information, scroll up and read the detailed
+messages."""
+
+UNAPPLIED_OVERRIDES = """WARNING: Some overrides did not match any errors.
+They can be removed, as they don't have any effect anyway.  If you're getting
+errors at the same time, maybe you didn't specify the overrides correctly."""
+
+
+class Error(Exception):
+  """Generic error."""
+
+
+class UsageError(Error):
+  """Problem with usage, e.g. command line options."""
+
+
+def main():
+  parser = optparse.OptionParser(USAGE)
+  parser.add_option("-d", "--debug",
+      dest="debug",
+      action="store_true",
+      default=False,
+      help="Switch on debugging messages")
+  parser.add_option("-q", "--quiet",
+      dest="quiet",
+      action="store_true",
+      default=False,
+      help="Display less messages")
+  parser.add_option("--catalog-release",
+      dest="catrel",
+      default="current",
+      help="A catalog release: current, unstable, testing, stable.")
+  parser.add_option("-r", "--os-releases",
+      dest="osrel_commas",
+      help=("Comma separated list of ['SunOS5.9', 'SunOS5.10'], "
+            "e.g. 'SunOS5.9,SunOS5.10'."))
+  parser.add_option("-a", "--architecture",
+      dest="arch",
+      help="Architecture: i386, sparc.")
+  parser.add_option("--profile", dest="profile",
+      default=False, action="store_true",
+      help="Enable profiling (a developer option).")
+  options, args = parser.parse_args()
+  assert len(args), "The list of files or md5 sums must not be empty."
+  logging_level = logging.INFO
+  if options.quiet:
+    logging_level = logging.WARNING
+  elif options.debug:
+    # If both flags are set, debug wins.
+    logging_level = logging.DEBUG
+  logging.basicConfig(level=logging_level)
+  logging.debug("Starting.")
+
+  configuration.SetUpSqlobjectConnection()
+  dm = database.DatabaseManager()
+  dm.AutoManage()
+
+
+  err_msg_list = []
+  if not options.osrel_commas:
+    err_msg_list.append("Please specify --os-releases.")
+  if not options.arch:
+    err_msg_list.append("Please specify --architecture.")
+  if err_msg_list:
+    raise UsageError(" ".join(err_msg_list))
+
+  stats_list = []
+  collector = package_stats.StatsCollector(
+      logger=logging,
+      debug=options.debug)
+  # We need to separate files and md5 sums.
+  md5_sums, file_list = [], []
+  for arg in args:
+    if checkpkg.MD5_RE.match(arg):
+      md5_sums.append(arg)
+    else:
+      file_list.append(arg)
+  if file_list:
+    stats_list = collector.CollectStatsFromFiles(file_list, None)
+  # We need the md5 sums of these files
+  md5_sums.extend([x["basic_stats"]["md5_sum"] for x in stats_list])
+  assert md5_sums, "The list of md5 sums must not be empty."
+  logging.debug("md5_sums: %s", md5_sums)
+  osrel_list = options.osrel_commas.split(",")
+  logging.debug("Reading packages data from the database.")
+  # This part might need improvements in order to handle a whole
+  # catalog.  On the other hand, if we already have the whole catalog in
+  # the database, we can do it altogether differently.
+  # Transforming the result to a list in order to force object
+  # retrieval.
+  sqo_pkgs = list(models.Srv4FileStats.select(
+    sqlobject.IN(models.Srv4FileStats.q.md5_sum, md5_sums)))
+  tags_for_all_osrels = []
+  sqo_arch = models.Architecture.selectBy(name=options.arch).getOne()
+  sqo_catrel = models.CatalogRelease.selectBy(name=options.catrel).getOne()
+  for osrel in osrel_list:
+    sqo_osrel = models.OsRelease.selectBy(short_name=osrel).getOne()
+    dm.VerifyContents(sqo_osrel, sqo_arch)
+    check_manager = checkpkg_lib.CheckpkgManager2(
+        CHECKPKG_MODULE_NAME,
+        sqo_pkgs,
+        osrel,

@@ Diff output truncated at 100000 characters. @@
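
As an aside on the "Checks to implement" list in the README diff: individual
checks are plain functions whose names start with Check, called with
(pkg_data, error_mgr, logger, messenger), as seen in the manager code removed
from checkpkg.py above.  A rough sketch of the "CSWpmfoo -> suggest CSWpm-foo"
check could look like the following; the tag name and regular expression are
hypothetical, and the sketch assumes the same interface carries over into
checkpkg_lib.py.

import re

# Hypothetical: flag Perl module packages named CSWpmfoo instead of CSWpm-foo.
PM_PKGNAME_RE = re.compile(r"^CSWpm(?!-)")


def CheckPerlModulePkgname(pkg_data, error_mgr, logger, messenger):
  """Suggests the CSWpm-foo naming scheme (sketch only)."""
  pkgname = pkg_data["basic_stats"]["pkgname"]
  if PM_PKGNAME_RE.match(pkgname):
    suggestion = pkgname.replace("CSWpm", "CSWpm-", 1)
    # The tag name below is made up for illustration.
    error_mgr.ReportError("pkgname-not-dash-separated", pkgname)
    messenger.Message("Consider renaming %s to %s." % (pkgname, suggestion))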
