Saturday, 15 February 2014

c# - Unexpected behavior of upload to SharePoint 2013 via OData if column value uniqueness constraint is violated -


I upload documents to SharePoint 2010 and 2013 via the OData service ListData.svc, proxied through a Visual Studio-generated service reference. The code performing this task is similar to the C# snippet below:

public static void Upload<T>(string libraryName, string localFilePath, string targetLibraryRelativePath, T document)
{
    string fullLibraryPath = ContextFactory.Root + libraryName + targetLibraryRelativePath;
    var ctx = ContextFactory.GetODataContext();
    FileInfo fi = new FileInfo(localFilePath);
    document.Path = fullLibraryPath;
    document.Name = fi.Name;
    document.ContentType = "Document";
    var mimeType = DeriveMimeType(localFilePath);
    using (FileStream sourceFile = System.IO.File.Open(localFilePath, FileMode.Open, FileAccess.Read))
    {
        ctx.AddObject(libraryName, document);
        ctx.SetSaveStream(document, sourceFile, true, mimeType, fullLibraryPath + "/" + document.Name);
        ctx.SaveChanges();
    }
}

Recently a new business requirement has surfaced: prevent document replicas from getting into the library. The first approach that came to mind was introducing a new column with a mandatory unique value and placing an MD5 hash there:

document.MD5 = CalculateMD5Hash(File.ReadAllBytes(localFilePath));
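The `CalculateMD5Hash` helper is not shown in the original snippets; a minimal sketch, assuming the hash is stored as a hex string in a single-line-of-text column with "Enforce unique values" enabled:

```csharp
using System;
using System.Security.Cryptography;

public static class HashHelper
{
    // Computes an MD5 digest of the file contents and returns it as a
    // lowercase hex string, suitable for a unique-value text column.
    public static string CalculateMD5Hash(byte[] contents)
    {
        using (var md5 = MD5.Create())
        {
            byte[] digest = md5.ComputeHash(contents);
            return BitConverter.ToString(digest).Replace("-", "").ToLowerInvariant();
        }
    }
}
```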

I expected that if a new upload with the same MD5 hash were attempted, the service would prevent the duplicate document contents from reaching the library, by virtue of the unique-value constraint on the hash column.

My expectations turned out to be only partially correct: while the service does indeed throw an exception on such an attempt, the duplicate content nevertheless ends up uploaded, lacking the attributes of the original document and thus hurting the integrity of the document library.

While it would not be a problem to alter the implementation a bit and check the library for an existing item with an equal MD5 hash before calling SetSaveStream, that feels like excessive effort and totally defeats the purpose of declaring a unique constraint on the MD5 hash column.

My question is: what am I missing here, and is there a way to have the uniqueness of document contents enforced for me without the penalty of inconsistent items being placed in the document library?

The best alternatives I can think of at the moment are:

Checking for the existence of the MD5 hash in the library before trying to add the document (as suggested in the OP):

var md5 = CalculateMD5Hash(File.ReadAllBytes(fi.FullName));
// Check that the MD5 hash doesn't already exist in the library
if (ctx.Test.Where(i => i.MD5 == md5).Count() == 0)
{
    try
    {
        document.Path = fullLibraryPath;
        document.Name = fi.Name;
        document.ContentType = "Document";
        var mimeType = DeriveMimeType(localFilePath);
        using (FileStream sourceFile = System.IO.File.Open(localFilePath, FileMode.Open, FileAccess.Read))
        {
            ctx.AddObject(libraryName, document);
            ctx.SetSaveStream(document, sourceFile, true, mimeType, fullLibraryPath + "/" + document.Name);
            ctx.SaveChanges();
        }
    }
    catch (DataServiceRequestException)
    {
        // TODO: log the exception
    }
}

Using a try { ... } catch { ... } statement in order to catch the exception and delete the added item:

using (FileStream sourceFile = System.IO.File.Open(localFilePath, FileMode.Open, FileAccess.Read))
{
    ctx.AddObject(libraryName, document);
    ctx.SetSaveStream(document, sourceFile, true, mimeType, fullLibraryPath + "/" + document.Name);
    try
    {
        ctx.SaveChanges();
    }
    catch (Exception)
    {
        ctx.DeleteObject(document);
        ctx.SaveChanges();
        // TODO: log the exception
    }
}
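The two alternatives can also be combined. This is not from the original post, but a sketch under the same assumed names (ContextFactory, the Test entity set, CalculateMD5Hash, DeriveMimeType): the pre-check cheaply skips the common duplicate case, while the catch-and-delete handles the race where another client uploads identical content between the check and SaveChanges:

```csharp
// Hypothetical combination of both alternatives above.
public static void UploadUnique<T>(string libraryName, string localFilePath,
                                   string fullLibraryPath, T document)
{
    var ctx = ContextFactory.GetODataContext();
    var fi = new FileInfo(localFilePath);
    var md5 = CalculateMD5Hash(File.ReadAllBytes(fi.FullName));

    // Cheap first-line check: skip the upload if the hash is already present.
    if (ctx.Test.Where(i => i.MD5 == md5).Count() > 0)
        return;

    document.Path = fullLibraryPath;
    document.Name = fi.Name;
    document.ContentType = "Document";
    document.MD5 = md5;

    using (var sourceFile = File.Open(localFilePath, FileMode.Open, FileAccess.Read))
    {
        ctx.AddObject(libraryName, document);
        ctx.SetSaveStream(document, sourceFile, true, DeriveMimeType(localFilePath),
                          fullLibraryPath + "/" + document.Name);
        try
        {
            ctx.SaveChanges();
        }
        catch (DataServiceRequestException)
        {
            // A concurrent upload beat the check; remove the inconsistent item
            // so the unique-constraint violation leaves no partial document behind.
            ctx.DeleteObject(document);
            ctx.SaveChanges();
        }
    }
}
```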

c# sharepoint odata
