bucketed_patche

1 patch for repository http://darcs.net/screened:

Tue Jun 24 21:31:17 ART 2014  Marcio Diaz <[email protected]>
  * bucketed cache

New patches:

[bucketed cache
Marcio Diaz <[email protected]>**20140625003117
 Ignore-this: 1f668e6658483441137fc5ed0cbc1e9d
] hunk ./src/Darcs/Repository.hs 167
     , CacheLoc(..)
     , WritableOrNot(..)
     , hashedDir
+    , bucketFolder
     , CacheType(Directory)
     , reportBadSources
     )
hunk ./src/Darcs/Repository.hs 666
         BL.writeFile x' y
         renameFile x' z
     writeFile' (Just ca) z y = do
-        let x' = joinPath . tail $ splitPath z -- drop darcsdir
-        ex <- doesFileExist $ ca </> x'
+        let fileFullPath = case splitPath z of
+              _:hDir:hFile:_ -> joinPath [ca, hDir, bucketFolder hFile, hFile]
+              _ -> fail "Unexpected file path"
+        ex <- doesFileExist fileFullPath
         if ex
hunk ./src/Darcs/Repository.hs 671
-            then createLink' (ca </> x') z
-            else withTemp $ \x'' -> do
-                BL.writeFile x'' y
-                createLink' x'' $ ca </> x'
-                renameFile x'' z
+            then createLink' fileFullPath z
+            else withTemp $ \x'' -> do
+                BL.writeFile x'' y
+                createLink' x'' fileFullPath
+                renameFile x'' z
     createLink' z y = do
         createDirectoryIfMissing True $ takeDirectory y
         createLink z y `catchall` return ()
hunk ./src/Darcs/Repository/Cache.hs 13
     , WritableOrNot(..)
     , HashedDir(..)
     , hashedDir
+    , bucketFolder
     , unionCaches
     , unionRemoteCaches
     , cleanCaches
hunk ./src/Darcs/Repository/Cache.hs 37
 import qualified Data.ByteString.Char8 as BC (unpack)
 import Data.List ( nub, intercalate )
 import Data.Maybe ( catMaybes, listToMaybe, fromMaybe )
-import System.FilePath.Posix ( (</>) )
+import System.FilePath.Posix ( (</>), joinPath )
 import System.Directory ( createDirectoryIfMissing, removeFile, doesFileExist,
                           doesDirectoryExist, getDirectoryContents,
                           getPermissions )
hunk ./src/Darcs/Repository/Cache.hs 81
 data WritableOrNot = Writable | NotWritable
-    deriving ( Show )
+    deriving ( Eq, Show )
 data CacheType = Repo | Directory
hunk ./src/Darcs/Repository/Cache.hs 153
 -- | Compares two caches, a remote cache is greater than a local one.
 -- The order of the comparison is given by: local < http < ssh
 compareByLocality :: CacheLoc -> CacheLoc -> Ordering
-compareByLocality (Cache _ _ x) (Cache _ _ y)
+compareByLocality (Cache _ w x) (Cache _ z y)
     | isLocal x && isRemote y = LT
     | isRemote x && isLocal y = GT
     | isHttpUrl x && isSshUrl y = LT
hunk ./src/Darcs/Repository/Cache.hs 158
     | isSshUrl x && isHttpUrl y = GT
+    | isLocal x && isWritable w
+      && isLocal y && isNotWritable z = LT
     | otherwise = EQ
   where
     isRemote r = isHttpUrl r || isSshUrl r
hunk ./src/Darcs/Repository/Cache.hs 164
     isLocal = isFile
+    isWritable = (==) Writable
+    isNotWritable = (==) NotWritable

 repo2cache :: String -> Cache
 repo2cache r = Ca [Cache Repo NotWritable r]
hunk ./src/Darcs/Repository/Cache.hs 216
 isThisRepo (Cache Repo Writable _) = True
 isThisRepo _ = False

+bucketFolder :: String -> String
+bucketFolder f = take 2 (cleanHash f)
+  where
+    cleanHash fileName = case dropWhile (/= '-') fileName of
+      [] -> fileName
+      s -> drop 1 s
+
 -- | @hashedFilePath cachelocation subdir hash@ returns the physical filename
 -- of hash @hash@ in the @subdir@ section of @cachelocation@.
 hashedFilePath :: CacheLoc -> HashedDir -> String -> String
hunk ./src/Darcs/Repository/Cache.hs 227
 hashedFilePath (Cache Directory _ d) s f =
+    joinPath [d, hashedDir s, bucketFolder f, f]
+hashedFilePath (Cache Repo _ r) s f =
+    joinPath [r, darcsdir, hashedDir s, f]
+
+-- | @hashedFilePathReadOnly cachelocation subdir hash@ returns the physical filename
+-- of hash @hash@ in the @subdir@ section of @cachelocation@.
+-- If directory, assume it is non-bucketed cache (old cache location).
+hashedFilePathReadOnly :: CacheLoc -> HashedDir -> String -> String
+hashedFilePathReadOnly (Cache Directory _ d) s f =
     d ++ "/" ++ hashedDir s ++ "/" ++ f
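For orientation, here is a minimal standalone sketch, not part of the patch: the new writeFile' in the Repository.hs hunk above rewrites a darcsdir-relative path such as _darcs/patches/<hash> into <cache>/<dir>/<bucket>/<hash>, where the bucket comes from bucketFolder as just defined. The helper name cacheDestination and the example values below are illustrative only.

    import System.FilePath.Posix ( joinPath, splitPath )

    -- same definition as bucketFolder in the Cache.hs hunk above
    bucketFolder :: String -> String
    bucketFolder f = take 2 (case dropWhile (/= '-') f of { [] -> f; s -> drop 1 s })

    -- mirrors the fileFullPath computation in writeFile' (Just ca) z y
    cacheDestination :: FilePath -> FilePath -> Maybe FilePath
    cacheDestination ca z =
        case splitPath z of
            _darcs : hDir : hFile : _ -> Just (joinPath [ca, hDir, bucketFolder hFile, hFile])
            _                         -> Nothing

    -- cacheDestination "/home/me/.cache/darcs" "_darcs/patches/0000000423-1f66..."
    --   == Just "/home/me/.cache/darcs/patches/1f/0000000423-1f66..."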

hunk ./src/Darcs/Repository/Cache.hs 237
-hashedFilePath (Cache Repo _ r) s f =
+hashedFilePathReadOnly (Cache Repo _ r) s f =
     r ++ "/" ++ darcsdir ++ "/" ++ hashedDir s ++ "/" ++ f

 -- | @peekInCache cache subdir hash@ tells whether @cache@ and contains an
hunk ./src/Darcs/Repository/Cache.hs 388
   where
     ffuc (c : cs)
         | not (writable c) &&
-            (Anywhere == fromWhere || isFile (hashedFilePath c subdir f)) = do
-            let cacheFile = hashedFilePath c subdir f
+            (Anywhere == fromWhere || isFile (hashedFilePathReadOnly c subdir f)) = do
+            let cacheFile = hashedFilePathReadOnly c subdir f
             debugMessage $ "In fetchFileUsingCachePrivate I'm going manually"
                 ++ " getting " ++ f ++ " from " ++ cacheFile
hunk ./src/Darcs/Repository/HashedRepo.hs 19
 {-# LANGUAGE CPP, ScopedTypeVariables #-}
 module Darcs.Repository.HashedRepo
-    ( revertTentativeChanges
+    ( inventoriesDir
+    , pristineDir
+    , patchesDir
+    , revertTentativeChanges
     , finalizeTentativeChanges
     , cleanPristine
     , filterDirContents
hunk ./src/Darcs/Repository/HashedRepo.hs 143
 inventoriesDir = "inventories"
 inventoriesDirPath = makeDarcsdirPath inventoriesDir

-tentativePristinePath, pristineDirPath :: String
+pristineDir, tentativePristinePath, pristineDirPath :: String
 tentativePristinePath = makeDarcsdirPath "tentative_pristine"
hunk ./src/Darcs/Repository/HashedRepo.hs 145
-pristineDirPath = makeDarcsdirPath "pristine.hashed"
+pristineDir = "pristine.hashed"
+pristineDirPath = makeDarcsdirPath pristineDir

 patchesDir, patchesDirPath :: String
 patchesDir = "patches"
hunk ./src/Darcs/Repository/Prefs.hs 45
     , globalCacheDir
     , globalPrefsDirDoc
     , globalPrefsDir
+    , oldGlobalCacheDir
     ) where

 import Control.Exception ( catch )
hunk ./src/Darcs/Repository/Prefs.hs 248
         Nothing -> return []

 globalCacheDir :: IO (Maybe FilePath)
-globalCacheDir | windows = ((</> "cache") `fmap`) `fmap` globalPrefsDir
+globalCacheDir | windows = ((</> "cache2") `fmap`) `fmap` globalPrefsDir
                | osx = ((</> "darcs") `fmap`) `fmap` osxCacheDir
                | otherwise = ((</> "darcs") `fmap`) `fmap` xdgCacheDir
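The net effect of the hunks so far: writable caches use the bucketed layout of hashedFilePath, while read-only caches, including the old global cache, are still addressed through the flat layout of hashedFilePathReadOnly and so keep working without migration; the compareByLocality change above additionally makes a writable local cache compare as less than a read-only local one, so the bucketed cache is consulted first. For one illustrative hash the two layouts look like:

    /home/me/.cache/darcs/patches/1f/0000000423-1f668e66...   (bucketed, hashedFilePath)
    /home/me/.darcs/cache/patches/0000000423-1f668e66...      (flat, hashedFilePathReadOnly)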

hunk ./src/Darcs/Repository/Prefs.hs 256
 -- now ony used with read-only access.
 oldGlobalCacheDir :: IO (Maybe FilePath)
 oldGlobalCacheDir
-    | windows = return Nothing
-    | otherwise = do
     dir <- ((</> "cache") `fmap`) `fmap` globalPrefsDir
     case dir of
       Nothing -> return Nothing
hunk ./src/Darcs/UI/Commands/Optimize.hs 24
 import Control.Applicative ( (<$>) )
 import Control.Exception ( finally )
-import Control.Monad ( when, unless )
+import Control.Monad ( when, unless, forM_ )
 import Data.Maybe ( isJust )
 import Data.List ( sort )
 import System.Directory
hunk ./src/Darcs/UI/Commands/Optimize.hs 51
     , workingRepoDir
     , umaskOption
     )
-import Darcs.Repository.Prefs ( getPreflist, getCaches )
+import Darcs.Repository.Prefs ( getPreflist, getCaches, globalCacheDir, oldGlobalCacheDir )
 import Darcs.Repository
     ( Repository
     , withRepoLock
hunk ./src/Darcs/UI/Commands/Optimize.hs 61
     , cleanRepository
     , replacePristine
     )
-import Darcs.Repository.HashedRepo ( filterDirContents, readHashedPristineRoot )
+import Darcs.Repository.HashedRepo ( inventoriesDir, patchesDir, pristineDir,
+                                     filterDirContents, readHashedPristineRoot )
 import Darcs.Repository.Old ( oldRepoFailMsg )
 import Darcs.Repository.InternalTypes ( Repository(..) )
 import Darcs.Patch.Witnesses.Ordered
hunk ./src/Darcs/UI/Commands/Optimize.hs 100
     , (</>)
     , (<.>)
     , takeFileName
+    , joinPath
     )
hunk ./src/Darcs/UI/Commands/Optimize.hs 102
+import Text.Printf ( printf )
+import System.Posix.Files ( getFileStatus, isDirectory )
 import Darcs.UI.Flags ( compression, verbosity, useCache, umask )
 import Darcs.Repository.Flags
hunk ./src/Darcs/UI/Commands/Optimize.hs 110
     ( UpdateWorking (..), DryRun ( NoDryRun ), UseCache (..), UMask (..)
     , WithWorkingDir(WithWorkingDir) )
 import Darcs.Patch.Progress ( progressFL )
-import Darcs.Repository.Cache ( hashedDir, HashedDir(HashedPristineDir) )
+import Darcs.Repository.Cache ( hashedDir, bucketFolder,
+                                HashedDir(HashedPristineDir) )
 import Darcs.Repository.Format
     ( identifyRepoFormat
     , createRepoFormat
hunk ./src/Darcs/UI/Commands/Optimize.hs 165
                             CommandData optimizeUncompress,
                             CommandData optimizeRelink,
                             CommandData optimizePristine,
-                            CommandData optimizeUpgrade
+                            CommandData optimizeUpgrade,
+                            CommandData optimizeBucketed
                           ]
   }
hunk ./src/Darcs/UI/Commands/Optimize.hs 200
   cleanRepository repository -- garbage collect pristine.hashed, inventories and patches directories
   putInfo opts "Done cleaning repository!"

+optimizeBucketed :: DarcsCommand
+optimizeBucketed = common
+    { commandName = "cache"
+    , commandHelp = "Migrate cache in old formats to bucketed cache."
+    , commandDescription = "optimize global cache for access through shell commands like ls/rm"
+    , commandCommand = optimizeBucketedCmd
+    }
+
 optimizeUpgrade :: DarcsCommand
 optimizeUpgrade = common
   { commandName = "upgrade"
hunk ./src/Darcs/UI/Commands/Optimize.hs 517
     ex <- doesFileExist x
     when ex $ removeFile x

+optimizeBucketedCmd :: (AbsolutePath, AbsolutePath) -> [DarcsFlag] -> [String] -> IO ()
+optimizeBucketedCmd _ opts _ = do
+  gOldCacheDir <- oldGlobalCacheDir
+  gCacheDir <- globalCacheDir
+
+  case gCacheDir of
+    Just gCacheDir' -> do
+      let gCachePristineDir = joinPath [gCacheDir', pristineDir]
+          gCacheInventoriesDir = joinPath [gCacheDir', inventoriesDir]
+          gCachePatchesDir = joinPath [gCacheDir', patchesDir]
+      debugMessage "Making bucketed cache from new cache."
+      toBucketed gCachePristineDir gCachePristineDir
+      toBucketed gCacheInventoriesDir gCacheInventoriesDir
+      toBucketed gCachePatchesDir gCachePatchesDir
+      case gOldCacheDir of
+        Just gOldCacheDir' -> do
+          debugMessage "Making bucketed cache from old cache."
+          toBucketed (joinPath [gOldCacheDir', pristineDir]) gCachePristineDir
+          toBucketed (joinPath [gOldCacheDir', inventoriesDir]) gCacheInventoriesDir
+          toBucketed (joinPath [gOldCacheDir', patchesDir]) gCachePatchesDir
+        Nothing -> debugMessage "Old global cache doesn't exist."
+    Nothing -> debugMessage "New global cache doesn't exist."
+  putInfo opts "Done making bucketed cache!"
+  where
+    toBucketed :: FilePath -> FilePath -> IO ()
+    toBucketed src dest = do
+      debugMessage $ "Making " ++ src ++ " bucketed in " ++ dest
+      forM_ subDirSet $ \subDir ->
+        createDirectoryIfMissing True (dest </> subDir)
+      createDirectoryIfMissing True src
+      fileNames <- getDirectoryContents src
+      forM_ fileNames $ \file -> do
+        fileStatus <- getFileStatus (src </> file)
+        if not $ isDirectory fileStatus
+          then renameFile' src dest file
+          else return ()
+
+    renameFile' :: FilePath -> FilePath -> FilePath -> IO ()
+    renameFile' s d f = renameFile (s </> f) (joinPath [d, bucketFolder f, f])
+
+    subDirSet :: [String]
+    subDirSet = map toStrHex [0..255]
+
+    toStrHex :: Int -> String
+    toStrHex = printf "%02x"
+
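Once the patch is applied, the migration is run as "darcs optimize cache" (the new subcommand registered above): it converts the current global cache and, if present, the old cache location returned by oldGlobalCacheDir to the bucketed layout. A rough standalone sketch of what one per-directory pass does follows; it is illustrative only, the real logic being toBucketed above, which also logs progress and creates a missing source directory:

    import Control.Monad ( forM_, unless )
    import System.Directory ( createDirectoryIfMissing, doesDirectoryExist,
                              getDirectoryContents, renameFile )
    import System.FilePath.Posix ( (</>) )
    import Text.Printf ( printf )

    -- bucketFolder as defined in the Cache.hs hunk of this patch
    bucketFolder :: String -> String
    bucketFolder f = take 2 (case dropWhile (/= '-') f of { [] -> f; s -> drop 1 s })

    -- create the 256 bucket directories and move every regular file of a
    -- flat cache directory into its two-hex-digit bucket
    migrate :: FilePath -> FilePath -> IO ()
    migrate src dest = do
        forM_ [0 .. 255 :: Int] $ \n ->
            createDirectoryIfMissing True (dest </> printf "%02x" n)
        files <- getDirectoryContents src
        forM_ files $ \f -> do
            isDir <- doesDirectoryExist (src </> f)
            unless isDir $
                renameFile (src </> f) (dest </> bucketFolder f </> f)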

Context:

[Documentation of function Darcs.Repository.Internal.misplacedPatches .
Ale Gadea <[email protected]>**20140611054743
 Ignore-this: 8c388563d9d269ffd723cf67aad3c839
 misplacedPatches is the new name of the function called chooseOrder.
]
[remove patch index flags from rollback command
Guillaume Hoffmann <[email protected]>**20140612170904
 Ignore-this: 342ccf1ce50dd76b5af334df2298f63
]
[resolve issue2396: make convert a supercommand and enhance help strings
Guillaume Hoffmann <[email protected]>**20140610210546
 Ignore-this: 41493c28dab5a745b1d2c7107212724e
]
[Documentation for Darcs.Patch.Witnesses.Sealed module.
Ale Gadea <[email protected]>**20140610190035
 Ignore-this: 9618a5fadfff84f05fdbd1746345ca52
]
[Documentation for Darcs.Patch.Witnesses.Ordered module.
Ale Gadea <[email protected]>**20140610185240
 Ignore-this: e4f610d286b5caafdefc49ba83cb8c7d
]
[marksfile support for convert --export
Guillaume Hoffmann <[email protected]>**20140609190214
 Ignore-this: 154f9a34ec7c65eaa64ab42462c60705
]
[conditionalise replHook using Template Haskell
Ganesh Sittampalam <[email protected]>**20140603061639
 Ignore-this: 579c7064a7e74cce4bd992c527aa1ba
]
[Resolve Issue2361: optimize --reorder runs forever with one repository
Ale Gadea <[email protected]>**20140605210012
 Ignore-this: b18bc37e5d6668df62f05679c629e08c
]
[factorize boilerplate of optimize subcommands
Guillaume Hoffmann <[email protected]>**20140605174035
 Ignore-this: 13cc1ac6f5a70fd96815dca2ab12f76c
]
[resolve issue2394: make optimize a supercommand
Guillaume Hoffmann <[email protected]>**20140603201207
 Ignore-this: 841fbf0c5e0017eaff8b87b75ad80a37
]
[adapt testsuite to new optimize supercommand
Guillaume Hoffmann <[email protected]>**20140603191624
 Ignore-this: 83ae766c19b58c215adf9f4ec72f08bd
]
[Use DeriveFunctor instead of manual instance for PatchMod
Owen Stephens <[email protected]>**20140430023826
 Ignore-this: 14f21fae95a81cdec9cf74c44b3d26c6
]
[indentation fix for Windows-only module
Ganesh Sittampalam <[email protected]>**20140530060239
 Ignore-this: 4b53307163b44e11e9febcd8c49a7125
 This is required by the Haskell2010 change (or we'd have to enable
 NondecreasingIndentation)
]
[fixed a typo in a comment in
[email protected]**20140511122411
 Ignore-this: ed08722291d80e441f601244ef15e7ab
]
[make optimize command respect --quiet by using
[email protected]**20140511105622
 Ignore-this: f4a139619c68470fe821929958a910b
]
[darcs.cabal: make Haskell2010 the default-language for all
[email protected]**20140511102244
 Ignore-this: 44afc0f4c5b5de0751edd2f99764dc20
 This implied some more changes: we must demand cabal version >= 1.10, and
 change the extensions fields to default-extensions; in the implementation
 of some of the commands, needed to fix the indentation of do blocks.
]
[re-advertise GHC 7.8 support
Ganesh Sittampalam <[email protected]>**20140529221736
 Ignore-this: 457971dce069111801eeb10fa3b5668e
]
[clean out some obsolete conditionalisation
Ganesh Sittampalam <[email protected]>**20140427175352
 Ignore-this: c1adbafd4846134551b407d9cb176e20
]
[bump version to 2.9.9
Ganesh Sittampalam <[email protected]>**20140427175310
 Ignore-this: 6f94d6f638e6f3fbc6025e02cbf095b1
]
[rm latex2html file
Guillaume Hoffmann <[email protected]>**20140519171502
 Ignore-this: 1b9dbcca5c360bfda280325b78a9736f
]
[implement doFastZip to create zip archive from pristine tree
Guillaume Hoffmann <[email protected]>**20140516190946
 Ignore-this: 483ef7fffc417a4811d2d8cd90885e18
]
[resolve issue2364: don't break list of 'bad sources'
Sergei Trofimovich <[email protected]>**20140513190719
 Ignore-this: d68861b39305be9b7c77e9a6fca89601
 This time the bug manifested on a simple operation:
     $ darcs record -a -m "something"
 Attempt to write a patch resulted in something like:
     Failed to record patch 'hello'
     HINT: I could not reach the following repositories:
           http://repetae.net/repos/jhc
           /home/st/.darcs/cache
           /home/st/.cache/darcs
           /home/st/dev/darcs/jhc
     If you're not using them, you should probably delete
 The sequence should be the following:
   1. store patch to inventory/foo
   2. try to store to a writable cache (say, ~/.darcs/cache/patches)
   3. fail to write
   4. filter out bad caches
   5. try again
   6. copy from cache to patches/
 Due to missing NOINLINE step 4. led to all caches treated as writable,
 thus step 5 failed without a chance for patch to go to 'patches/'.
 As a side-effect building darcs with -O0 produced seemingly working darcs.
 Reported-by: Ivan Miljenovic
]
[Accept issue2386
Owen Stephens <[email protected]>**20140511111624
 Ignore-this: 1cb3f4f24b17211126a74f0c1870b3d9
]
[fixed missing option for author in rebase-nochanges.sh
[email protected]**20140507192806
 Ignore-this: d645456182ae0ae607af3acfa802c53c
]
[added _test_playground to boring
[email protected]**20140506163344
 Ignore-this: 924245bbd59fb4cf4538554755d2c18b
]
[added cabal sandbox generated stuff to
[email protected]**20140502101628
 Ignore-this: b665885b27d2ffc12bba5c18d1b8b42a
]
[avoid a syntax error with
[email protected]**20140502101231
 Ignore-this: 4b474731a3618ce0aeed031aa87b4526
]
[Accept issue2382: darcs is confused if a dir is moved inplace of a file
Owen Stephens <[email protected]>**20140427225132
 Ignore-this: b4fef46928598bbc44c61325f6dc11b2
]
[Accept issue2383
Owen Stephens <[email protected]>**20140429162716
 Ignore-this: c994e1caeea21fe4db3db540c23c01fb
]
[Accept issue2380: darcs wont rename a file to the name of a deleted file
Owen Stephens <[email protected]>**20140422120300
 Ignore-this: e80d38d9a8a99e5992f65c9728783a62
]
[Resolve Issue2244: darcs tag should warn about duplicate tags
Ale Gadea <[email protected]>**20140507183109
 Ignore-this: 8d25e1130bff79907d3db39d02f40197
 Make darcs tag t, with t already an existing tag, cause a warning message.
]
[Use custom replHook to fix cabal repl
Owen Stephens <[email protected]>**20140503161130
 Ignore-this: 6cc22760b85225d3d1ea101106ab7797
]
[resolve issue2364: fix file corruption on double fetch
Sergei Trofimovich <[email protected]>**20140429134020
 Ignore-this: 9b069f38724673da4ff427f538132c24
 The bug is the result of attempt to fetch the same file (say F)
 by the same URL (U) multiple times concurrently.
 First time U gets fetched by speculative prefetch logic.
 Second time as an ordinary file (while first fetch is not finished).
 The function 'copyUrlWithPriority' sends download request to 'urlChan'
 both times (it's already not a nice situation, fixed by this patch).
 Later urlThread satisfies first request, notifies receiver, and starts
 downloading exactly the same U again.
 I don't know exact data corruption mechanics yet, but it has to do with
 non-random intermediate file names of downloaded files and 'truncate' call
 when temp file is opened for a new downlaod job.
 All temp names are completely non-random for a single darcs run:
     urlThread :: Chan UrlRequest -> IO ()
     urlThread ch = do
       junk <- flip showHex "" `fmap` randomRIO rrange
       evalStateT urlThread' (UrlState Map.empty emptyQ 0 junk)
     createDownloadFileName :: FilePath -> UrlState -> FilePath
     createDownloadFileName f st = f ++ "-new_" ++ randomJunk st
 My theory is next download manages to step on toes of previous job.
 I'll try to make file names truly random in other patch. That way
 such errors should manifest as read erros instead of data corruption.
 Thanks!
]
[Use DeriveFunctor instead of manual instance for PatchMod
Owen Stephens <[email protected]>**20140430003438
 Ignore-this: 1a55af9820c6d5470443794ea0c71e91
]
[Allow options with path arguments to be specified in defaults
[email protected]**20140404192946
 Ignore-this: ff2ff9e6d96a06c4a8df29b3c9819b22
]
[fixed cut-n-paste error in haddock
[email protected]**20140403224430
 Ignore-this: 7e5b9fe7bbecbb52d8b639772b0fd208
]
[resolve issue2314: output-auto-name in defaults
[email protected]**20140403170012
 Ignore-this: 6dbd187b78bc2b108920cc0eaabfa5af
]
[update packs tests to new strings
Guillaume Hoffmann <[email protected]>**20140425211832
 Ignore-this: d7123c21b273b91a2ce79007f70ee37f
]
[resolve issue1268: enable to write darcs init x
Guillaume Hoffmann <[email protected]>**20140425200752
 Ignore-this: 2586d59ba17f94b655c3a48df80b5d66
 Original patch by Radoslav Dorcik
]
[rename get to clone
Guillaume Hoffmann <[email protected]>**20140425175210
 Ignore-this: 2c27cb2bc6a9978988386743b241c0b3
]
[remove Put since Get can clone to ssh destination faster
Guillaume Hoffmann <[email protected]>**20140425063225
 Ignore-this: c0dbc05fc7511977381abba47728810a
]
[resolve issue1066: clone to ssh URL by locally cloning then copying by scp
Guillaume Hoffmann <[email protected]>**20140425060647
 Ignore-this: 2778bc4774fe8d5c53d0011b2193c1c2
 Introduce an internal flag ForgetParent that enable to clone repositories
 while forgetting about their source (do not copy sources nor caches).
]
[do not tolerate ctrl+c when --complete is passed
Guillaume Hoffmann <[email protected]>**20140425053440
 Ignore-this: 856378499c61de4e4281d1ed966ede76
]
[do not print message twice when patches pack grabbing fails
Guillaume Hoffmann <[email protected]>**20140424165929
 Ignore-this: 148f8ca132725e81dd52d64f00cbd24b
]
[create inventories subdir at darcs init
Guillaume Hoffmann <[email protected]>**20140421133938
 Ignore-this: 5ace4beaf6c27c8a5105d74db977f61a
]
[TAG 2.9.9
Ganesh Sittampalam <[email protected]>**20140424063828
 Ignore-this: ae3cb4369f15af8cb2f19d6f5603d935
]

Patch bundle hash:
445f563f0c49f11f2d596b275d82d60d7cb94e19