Checking Out Directory / File with SVNKit - java

I can't see where checking out is documented on the wiki. Ideally, I would like to check out a file "example/folder/file.xml", or failing that just its folder, and then, when the application closes down or at some other point, be able to commit changes to this file back in. How do I do this?

As an SVNKit developer, I would recommend preferring the new API based on SvnOperationFactory. The old API (based on SVNClientManager) will remain operational, but all new SVN features will come only to the new API.
final SvnOperationFactory svnOperationFactory = new SvnOperationFactory();
try {
    final SvnCheckout checkout = svnOperationFactory.createCheckout();
    checkout.setSingleTarget(SvnTarget.fromFile(workingCopyDirectory));
    checkout.setSource(SvnTarget.fromURL(url));
    //... other options
    checkout.run();
} finally {
    svnOperationFactory.dispose();
}
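The question also asks about committing the file back when the application shuts down. With the same new API this is an SvnCommit operation; a minimal sketch (untested, assuming workingCopyDirectory is the working copy checked out above):

```java
final SvnOperationFactory svnOperationFactory = new SvnOperationFactory();
try {
    final SvnCommit commit = svnOperationFactory.createCommit();
    // committing the working-copy root picks up all modified files,
    // including example/folder/file.xml
    commit.setSingleTarget(SvnTarget.fromFile(workingCopyDirectory));
    commit.setCommitMessage("Commit changes made while the application was running");
    final SVNCommitInfo commitInfo = commit.run();
    if (commitInfo != null) {
        System.out.println("Committed revision " + commitInfo.getNewRevision());
    }
} finally {
    svnOperationFactory.dispose();
}
```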

You cannot check out a single file in Subversion; you have to check out a folder.
To check out a folder containing one or more files:
SVNClientManager ourClientManager = SVNClientManager.newInstance(null,
        repository.getAuthenticationManager());
SVNUpdateClient updateClient = ourClientManager.getUpdateClient();
updateClient.setIgnoreExternals(false);
updateClient.doCheckout(url, destPath, revision, revision, isRecursive);
To commit a previously checked out folder:
SVNClientManager ourClientManager = SVNClientManager.newInstance(null,
        repository.getAuthenticationManager());
ourClientManager.getWCClient().doInfo(wcPath, SVNRevision.HEAD);
ourClientManager.getCommitClient().doCommit(
        new File[] { wcPath }, keepLocks, commitMessage, false, true);

I also used the code snippet proposed by Dmitry Pavlenko and had no problems.
But it took nearly 30 minutes to check out or update a repository structure of 35 MB.
That is not usable in my use case (simply checking out a directory structure as part of the content/documents/media of a web application).
Or have I made some error?
final ISVNAuthenticationManager authManager = SVNWCUtil.createDefaultAuthenticationManager(name, password);
final SVNURL svnUrl = SVNURL.create(url.getProtocol(), name, url.getHost(), 443, url.getPath(), true);
SVNRepository svnRepo = SVNRepositoryFactory.create(svnUrl);
svnRepo.setAuthenticationManager(authManager);
svnOperationFactory.setAuthenticationManager(authManager);

SVNDirEntry entry = svnRepo.info(".", -1);
long remoteRevision = entry.getRevision();

if (!workingCopyDirectory.exists()) {
    workingCopyDirectory.mkdirs();
}

final SvnCheckout checkout = svnOperationFactory.createCheckout();
checkout.setSource(SvnTarget.fromURL(svnUrl));
checkout.setSingleTarget(SvnTarget.fromFile(workingCopyDirectory));
remoteRevision = checkout.run();

Access Remote Git Repo

I want to access the remote repository files of all branches, to analyze the committed code through Java without cloning locally. How can I achieve this, and what is the procedure, if there is any way to do it?
Thanks in advance.
Try scm4j-vcs-git:
public static final String WORKSPACE_DIR = System.getProperty("java.io.tmpdir") + "git-workspaces";
...
IVCSWorkspace workspace = new VCSWorkspace(WORKSPACE_DIR);
String repoUrl = "https://github.com/MyUser/MyRepo";
IVCSRepositoryWorkspace repoWorkspace = workspace.getVCSRepositoryWorkspace(repoUrl);
IVCS vcs = new GitVCS(repoWorkspace);
vcs.setCredentials("user", "password"); // if necessary
vcs.getBranchesList();
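If adding another library is acceptable, plain JGit can also list the remote branches without cloning, via an ls-remote call (a sketch; the URL and credentials are placeholders):

```java
import java.util.Collection;

import org.eclipse.jgit.api.Git;
import org.eclipse.jgit.lib.Ref;
import org.eclipse.jgit.transport.UsernamePasswordCredentialsProvider;

public class ListRemoteBranches {
    public static void main(String[] args) throws Exception {
        Collection<Ref> refs = Git.lsRemoteRepository()
                .setRemote("https://github.com/MyUser/MyRepo") // placeholder URL
                .setCredentialsProvider(
                        new UsernamePasswordCredentialsProvider("user", "password")) // if necessary
                .setHeads(true) // branches only (refs/heads/*)
                .call();
        for (Ref ref : refs) {
            System.out.println(ref.getName());
        }
    }
}
```

Note that reading the file contents of those branches still requires fetching the objects, so for full analysis some local (possibly shallow or bare) clone into a temporary directory may be unavoidable.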

How to check in a file into TFS using Java SDK

I am planning to integrate TFS with another application using a web service.
I am new to TFS, so I downloaded the TFS Java SDK 2010. I have been writing a sample program to check a file in to TFS, but have not been successful. There are also not many helpful posts on the internet with Java-side SDK samples.
Below is the code I have written:
public static void main(String[] args) {
    TFSTeamProjectCollection tpc = SnippetSettings.connectToTFS(); // got the connection to TFS
    VersionControlClient vcc = tpc.getVersionControlClient();
    // WorkspaceInfo wi = Workstation.Current.GetLocalWorkspaceInfo(Environment.CurrentDirectory);
    String[] paths = new String[1];
    paths[0] = "D:\\Tools\testfile.txt"; // wants to check in this local file
    // workspace is created at local path C:\ProgramData\Microsoft Team Foundation Local Workspaces
    Workspace ws = vcc.createWorkspace(null, "Testworkspacename3", null, "", "Testcomment", null, null);
    // this line gives me 0 count, so this is problematic: 0 means nothing is being added
    int item = ws.pendAdd(paths, true, null, LockLevel.NONE, GetOptions.GET_ALL,
            PendChangesOptions.GET_LATEST_ON_CHECKOUT);
    PendingSet pd = ws.getPendingChanges();
    PendingChange[] pendingChanges = pd.getPendingChanges();
    ws.checkIn(pendingChanges, "samashti comment");
    Project project = tpc.getWorkItemClient().getProjects().get(SnippetSettings.PROJECT_NAME);
    System.out.println();
}
Please help: what is wrong here? Can someone provide me a correct working sample for checking in a new file and an existing file using Java?
Just follow these steps:
1. Connect to the team project collection
2. Get the version control client
3. Create a new workspace
4. Add the file to the workspace
5. Get the pending changes
6. Check in the pending changes
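Those steps can be sketched roughly as follows (untested; the workspace name and file path are placeholders, and the file must live under the workspace's mapped local folder). Note also that the path in the question, "D:\\Tools\testfile.txt", contains an unescaped \t, i.e. a tab character, which by itself would make pendAdd report 0 items:

```java
TFSTeamProjectCollection tpc = SnippetSettings.connectToTFS();       // 1. connect
VersionControlClient vcc = tpc.getVersionControlClient();            // 2. version control client

// 3. create a workspace (placeholder name and comment)
Workspace ws = vcc.createWorkspace(null, "MyWorkspace", null, "", "comment", null, null);

// 4. pend an add for a file under a mapped working folder
//    (note the escaped backslash: \\testfile.txt, not \testfile.txt)
String[] paths = { "D:\\Tools\\testfile.txt" };
int pended = ws.pendAdd(paths, true, null, LockLevel.NONE,
        GetOptions.GET_ALL, PendChangesOptions.GET_LATEST_ON_CHECKOUT);

// 5. collect the pending changes
PendingSet pendingSet = ws.getPendingChanges();
PendingChange[] changes = pendingSet.getPendingChanges();

// 6. check them in
ws.checkIn(changes, "Initial check-in");
```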
Below are some links about the TFS SDK for Java for your reference:
https://github.com/gocd/gocd/blob/master/tfs-impl/src/com/thoughtworks/go/tfssdk/TfsSDKCommand.java
https://github.com/jenkinsci/tfs-plugin/blob/master/src/main/java/hudson/plugins/tfs/commands/NewWorkspaceCommand.java
Please see this code snippet for creating and mapping a workspace, as of TFS-SDK-14.0.3:
public static Workspace createAndMapWorkspace(final TFSTeamProjectCollection tpc) {
    final String workspaceName = "SampleVCWorkspace" + System.currentTimeMillis(); //$NON-NLS-1$
    Workspace workspace = null;

    // Get the workspace
    workspace = tpc.getVersionControlClient().tryGetWorkspace(ConsoleSettings.MAPPING_LOCAL_PATH);

    // Create and map the workspace if it does not exist
    if (workspace == null) {
        workspace = tpc.getVersionControlClient().createWorkspace(
                null,
                workspaceName,
                "Sample workspace comment", //$NON-NLS-1$
                WorkspaceLocation.SERVER,
                null,
                WorkspacePermissionProfile.getPrivateProfile());

        // Map the workspace
        final WorkingFolder workingFolder = new WorkingFolder(
                ConsoleSettings.MAPPING_SERVER_PATH,
                LocalPath.canonicalize(ConsoleSettings.MAPPING_LOCAL_PATH));
        workspace.createWorkingFolder(workingFolder);
    }

    System.out.println("Workspace '" + workspaceName + "' now exists and is mapped"); //$NON-NLS-1$ //$NON-NLS-2$
    return workspace;
}

How to delete files from SVN using SVN kit jar

I need to delete a file in the SVN repository using the SVNKit jar.
I tried:
SVNFileUtil.deleteFile(new File("URL"));
It does not throw any error, but the file at the URL I gave is not deleted.
My code:
repository = SVNRepositoryFactory.create(SVNURL.parseURIDecoded(url));

// create authentication data
ISVNAuthenticationManager authManager =
        SVNWCUtil.createDefaultAuthenticationManager(
                prop.getProperty("SVNusername"),
                prop.getProperty("SVNpassword"));
repository.setAuthenticationManager(authManager);

// need to identify latest revision
long latestRevision = repository.getLatestRevision();
System.out.println("Repository Latest Revision: " + latestRevision);

// create client manager and set authentication
SVNClientManager ourClientManager = SVNClientManager.newInstance();
ourClientManager.setAuthenticationManager(authManager);

// use SVNCommitClient to do the commit
SVNCommitClient commitClient = ourClientManager.getCommitClient();
commitClient.setIgnoreExternals(false);
SVNFileUtil.deleteFile(new File(urln));
SVNCommitClient client = new SVNCommitClient(authManager, null);
SVNCommitInfo info;
@manuelcr is right; alternatively, you can use the high-level API:
final SvnOperationFactory svnOperationFactory = new SvnOperationFactory();
try {
    final SvnRemoteDelete remoteDelete = svnOperationFactory.createRemoteDelete();
    remoteDelete.setSingleTarget(SvnTarget.fromURL(fileUrl));
    remoteDelete.setCommitMessage("Delete a file from the repository");
    final SVNCommitInfo commitInfo = remoteDelete.run();
    if (commitInfo != null) {
        final long newRevision = commitInfo.getNewRevision();
        System.out.println("Removed a file, revision " + newRevision + " created");
    }
} finally {
    svnOperationFactory.dispose();
}
Have you tried using a CommitEditor?
This link describes how to get one from your SVNRepository and use it to delete an entry.
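For reference, a low-level sketch of deleting an entry through a commit editor obtained from an SVNRepository (untested; the path is a placeholder, relative to the repository URL):

```java
// Assumes 'repository' is an authenticated SVNRepository, as in the question.
ISVNEditor editor = repository.getCommitEditor("Delete a file", null /* mediator */);
try {
    editor.openRoot(-1);                       // open the repository root
    editor.deleteEntry("folder/file.txt", -1); // path relative to the repository URL
    editor.closeDir();                         // close the root directory
    SVNCommitInfo info = editor.closeEdit();   // send the commit
    System.out.println("Committed revision " + info.getNewRevision());
} catch (SVNException e) {
    editor.abortEdit();                        // roll back on failure
    throw e;
}
```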

SVNKit to find diff between two files stored at separate locations with separate revision numbers

I am writing a Java program using the SVNKit API, and I need the correct class or call in the API that would allow me to find the diff between files stored in separate locations.
1st file:
https://abc.edc.xyz.corp/svn/di-edc/tags/ab-cde-fgh-axsym-1.0.0/src/site/apt/releaseNotes.apt
2nd file:
https://abc.edc.xyz.corp/svn/di-edc/tags/ab-cde-fgh-axsym-1.1.0/src/site/apt/releaseNotes.apt
I have used the API calls listed below to generate the diff output, but have been unsuccessful so far.
DefaultSVNDiffGenerator diffGenerator = new DefaultSVNDiffGenerator();
diffGenerator.displayFileDiff("", file1, file2, "10983", "8971", "text", "text/plain", output);
diffClient.doDiff(svnUrl1, SVNRevision.create(10868), svnUrl2, SVNRevision.create(8971), SVNDepth.IMMEDIATES, false, System.out);
Can anyone provide guidance on the correct way to do this?
Your code looks correct, but prefer using the new API:
final SvnOperationFactory svnOperationFactory = new SvnOperationFactory();
try {
    final ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    final SvnDiffGenerator diffGenerator = new SvnDiffGenerator();
    diffGenerator.setBasePath(new File(""));

    final SvnDiff diff = svnOperationFactory.createDiff();
    diff.setSources(SvnTarget.fromURL(url1, svnRevision1), SvnTarget.fromURL(url2, svnRevision2));
    diff.setDiffGenerator(diffGenerator);
    diff.setOutput(byteArrayOutputStream);
    diff.run();
} finally {
    svnOperationFactory.dispose();
}

BIRT Error : Unable to determine the default workspace location in Java

I get the following error:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
which makes little sense.
Reports are created using the BIRT designer within Eclipse, and we are using code to convert the reports into PDF.
The code looks something like:
final EngineConfig config = new EngineConfig();
config.setBIRTHome("./birt");
Platform.startup(config);
final IReportEngineFactory factory = (IReportEngineFactory) Platform
        .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
final HTMLRenderOption ho = new HTMLRenderOption();
ho.setImageHandler(new HTMLCompleteImageHandler());
config.setEmitterConfiguration(RenderOption.OUTPUT_FORMAT_HTML, ho);

// Create the engine.
this.engine = factory.createReportEngine(config);
final IReportRunnable report = this.engine.openReportDesign(reportName);
final IRunAndRenderTask task = this.engine.createRunAndRenderTask(report);

final RenderOption options = new HTMLRenderOption();
options.setOutputFormat(RenderOption.OUTPUT_FORMAT_PDF);
final String output = reportName.replaceFirst(".rptdesign", "." + RenderOption.OUTPUT_FORMAT_PDF);
options.setOutputFileName(output);
task.setRenderOption(options);

// Run the report.
task.run();
but it seems that during the task.run() method, the system throws the error.
This needs to be able to run standalone, without Eclipse. I hoped that setting BIRT home would make it happy, but there seems to be some other connection profile I am unaware of and probably don't need.
The full error :
07-Jan-2013 14:55:31 org.eclipse.datatools.connectivity.internal.ConnectivityPlugin log
SEVERE: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
07-Jan-2013 14:55:31 org.eclipse.birt.report.engine.api.impl.EngineTask handleFatalExceptions
SEVERE: An error happened while running the report. Cause:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getDefaultStateLocation(ConnectivityPlugin.java:155)
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getStorageLocation(ConnectivityPlugin.java:191)
at org.eclipse.datatools.connectivity.internal.ConnectionProfileMgmt.getStorageLocation(ConnectionProfileMgmt.java:1060)
at org.eclipse.datatools.connectivity.oda.profile.internal.OdaProfileFactory.defaultProfileStoreFile(OdaProfileFactory.java:170)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.defaultProfileStoreFile(OdaProfileExplorer.java:138)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.loadProfiles(OdaProfileExplorer.java:292)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.getProfileByName(OdaProfileExplorer.java:537)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getConnectionProfileImpl(ProfilePropertyProviderImpl.java:184)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getDataSourceProperties(ProfilePropertyProviderImpl.java:64)
at org.eclipse.datatools.connectivity.oda.consumer.helper.ConnectionPropertyHandler.getEffectiveProperties(ConnectionPropertyHandler.java:123)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.getEffectiveProperties(OdaConnection.java:826)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.open(OdaConnection.java:240)
at org.eclipse.birt.data.engine.odaconsumer.ConnectionManager.openConnection(ConnectionManager.java:165)
at org.eclipse.birt.data.engine.executor.DataSource.newConnection(DataSource.java:224)
at org.eclipse.birt.data.engine.executor.DataSource.open(DataSource.java:212)
at org.eclipse.birt.data.engine.impl.DataSourceRuntime.openOdiDataSource(DataSourceRuntime.java:217)
at org.eclipse.birt.data.engine.impl.QueryExecutor.openDataSource(QueryExecutor.java:407)
at org.eclipse.birt.data.engine.impl.QueryExecutor.prepareExecution(QueryExecutor.java:317)
at org.eclipse.birt.data.engine.impl.PreparedQuery.doPrepare(PreparedQuery.java:455)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.produceQueryResults(PreparedDataSourceQuery.java:190)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.execute(PreparedDataSourceQuery.java:178)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery.execute(PreparedOdaDSQuery.java:145)
at org.eclipse.birt.report.data.adapter.impl.DataRequestSessionImpl.execute(DataRequestSessionImpl.java:624)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:152)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:267)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1939)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.TableItemExecutor.execute(TableItemExecutor.java:62)
at org.eclipse.birt.report.engine.internal.executor.dup.SuppressDuplicateItemExecutor.execute(SuppressDuplicateItemExecutor.java:43)
at org.eclipse.birt.report.engine.internal.executor.wrap.WrappedReportItemExecutor.execute(WrappedReportItemExecutor.java:46)
at org.eclipse.birt.report.engine.internal.executor.l18n.LocalizedReportItemExecutor.execute(LocalizedReportItemExecutor.java:34)
at org.eclipse.birt.report.engine.layout.html.HTMLBlockStackingLM.layoutNodes(HTMLBlockStackingLM.java:65)
at org.eclipse.birt.report.engine.layout.html.HTMLPageLM.layout(HTMLPageLM.java:92)
at org.eclipse.birt.report.engine.layout.html.HTMLReportLayoutEngine.layout(HTMLReportLayoutEngine.java:100)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:180)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run (RunAndRenderTask.java:77)
Has anyone seen this error, and can anyone point me in the right direction?
When I had this issue, I tried two things. The first solved the error, but then I just got to the next one.
The first thing I tried was setting the setenv.sh file to have the following line:
export CATALINA_OPTS="$CATALINA_OPTS -Djava.io.tmpdir=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir -Dorg.eclipse.datatools_workspacepath=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir/workspace_dtp"
This solution worked after I created the tmpdir and workspace_dtp directories on my local Tomcat server, following the guidance here.
However, I then just got to the next error, which was a connection profile error. I can look into it again if you need; I know how to replicate the issue.
The second thing I tried ended up solving the issue completely; it had to do with our report designer selecting the wrong type of data source in the report design process. See my post on the Eclipse BIRT forums for the full story.
Basically, the report type was set to "JDBC Database Connection for Query Builder" when it should have been set to "JDBC Data Source." See the picture for reference:
Here is a tip that saved me from that pain:
just launch Eclipse with the "-clean" option after installing the BIRT plugins.
To be clear, my project was built from BIRT Maven dependencies and so should not use Eclipse dependencies to run (except for designing reports), but I think there was a conflict somewhere, especially with org.eclipse.datatools.connectivity_1.2.4.v201202041105.jar.
For global understanding, you should follow the migration guide:
http://wiki.eclipse.org/Birt_3.7_Migration_Guide#Connection_Profiles
It explains how to use a connection profile to externalize datasource parameters, which is not required if you define JDBC parameters directly in the report design.
I used this programmatic way to initialize the workspace directory:
@Override
public void initializeEngine() throws BirtException {
    // define eclipse datatools workspace path (required)
    String workspacePath = setDataToolsWorkspacePath();

    // set configuration
    final EngineConfig config = new EngineConfig();
    config.setLogConfig(workspacePath, Level.WARNING);
    // config.setResourcePath(getSqlDriverClassJarPath());

    // startup OSGi framework
    Platform.startup(config); // really needed?

    IReportEngineFactory factory = (IReportEngineFactory) Platform
            .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
    engine = factory.createReportEngine(config);
    engine.changeLogLevel(Level.WARNING);
}

private String setDataToolsWorkspacePath() {
    String workspacePath = System.getProperty(DATATOOLS_WORKSPACE_PATH);
    if (workspacePath == null) {
        workspacePath = FilenameUtils.concat(SystemUtils.getJavaIoTmpDir().getAbsolutePath(), "workspace_dtp");
        File workspaceDir = new File(workspacePath);
        if (!workspaceDir.exists()) {
            workspaceDir.mkdir();
        }
        if (!workspaceDir.canWrite()) {
            workspaceDir.setWritable(true);
        }
        System.setProperty(DATATOOLS_WORKSPACE_PATH, workspacePath);
    }
    return workspacePath;
}
I also needed to force the datasource parameters at runtime, this way:
private void generateReportOutput(InputStream reportDesignInStream, File outputFile, OUTPUT_FORMAT outputFormat,
        Map<PARAM, Object> params) throws EngineException, SemanticException {
    // Open a report design
    IReportRunnable design = engine.openReportDesign(reportDesignInStream);

    // Use data-source properties from persistence.xml
    forceDataSource(design);

    // Create RunAndRender task
    IRunAndRenderTask runTask = engine.createRunAndRenderTask(design);

    // Use data-source from JPA persistence context
    // forceDataSourceConnection(runTask);

    // Define report parameters
    defineReportParameters(runTask, params);

    // Set render options
    runTask.setRenderOption(getRenderOptions(outputFile, outputFormat, params));

    // Execute task
    runTask.run();
}

private void forceDataSource(IReportRunnable runableReport) throws SemanticException {
    DesignElementHandle designHandle = runableReport.getDesignHandle();
    Map<String, String> persistenceProperties = PersistenceUtils.getPersistenceProperties();
    String dsURL = persistenceProperties.get(AvailableSettings.JDBC_URL);
    String dsDatabase = StringUtils.substringAfterLast(dsURL, "/");
    String dsUser = persistenceProperties.get(AvailableSettings.JDBC_USER);
    String dsPass = persistenceProperties.get(AvailableSettings.JDBC_PASSWORD);
    String dsDriver = persistenceProperties.get(AvailableSettings.JDBC_DRIVER);

    SlotHandle dataSources = ((ReportDesignHandle) designHandle).getDataSources();
    int count = dataSources.getCount();
    for (int i = 0; i < count; i++) {
        DesignElementHandle dsHandle = dataSources.get(i);
        if (dsHandle != null && dsHandle instanceof OdaDataSourceHandle) {
            // replace connection properties from persistence.xml
            dsHandle.setProperty("databaseName", dsDatabase);
            dsHandle.setProperty("username", dsUser);
            dsHandle.setProperty("password", dsPass);
            dsHandle.setProperty("URL", dsURL);
            dsHandle.setProperty("driverClass", dsDriver);
            dsHandle.setProperty("jarList", getSqlDriverClassJarPath());
            // @SuppressWarnings("unchecked")
            // List<ExtendedProperty> privateProperties = (List<ExtendedProperty>) dsHandle
            //         .getProperty("privateDriverProperties");
            // for (ExtendedProperty extProp : privateProperties) {
            //     if ("odaUser".equals(extProp.getName())) {
            //         extProp.setValue(dsUser);
            //     }
            // }
        }
    }
}
I was having the same issue.
Changing the data source type from "JDBC Database Connection for Query Builder" to "JDBC Data Source" solved the problem for me.
