I have an Azure Functions project in Java. Unfortunately, Java is not supported very well )-: so everything is a "little" bit different. Could you point me to a reference example or documentation on how to deploy a function project written in Java to Azure? Everything I find covers just part of the problem, and the parts do not fit together )-:
Java uses the azure-functions-maven-plugin (which is a wrapper around Func Core Tools).
This plugin prepares a staging folder, which is compressed into a ZIP and deployed as a "package".
Unfortunately, this staging folder is named after the function name, so the ZIP package IS DEPENDENT on the target Azure RESOURCE name.
It is impossible to build an independent package (ZIP) and deploy it to several different environments (stages/dev/test/prod). Or is it?
It is especially weird when you use a CI/CD pipeline. It is not possible to have one BUILD pipeline and then several DEPLOY pipelines, because the build HAS TO be named (its internal directories) after the target deployment name, so it is not independent. This goes against the basic principle of having one build with a separate configuration for each environment.
Any idea how to solve this without running several builds? Thank you.
EDIT:
Running maven "mvn package" (which also runs azure-functions:package) produces a build with this directory layout:
${projectRoot}/target/azure-functions/${functionResourceName}/...
where
/...
is compressed into the final Azure package named ${functionResourceName}.zip.
So the "functionResourceName" appears just in the name of the ZIP file (and of the contained JAR with the same name). But ...
... if you try to deploy this ZIP to an Azure Function resource with another name, it fails.
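For context, the staging folder name described above comes from the plugin's appName setting in pom.xml. A minimal configuration sketch (the version number, resource group, and region are placeholders, not values from the question):

```xml
<!-- Sketch: the azure-functions-maven-plugin configuration whose appName
     determines the staging folder target/azure-functions/${functionAppName}/
     and the resulting ZIP name. -->
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-functions-maven-plugin</artifactId>
  <version>1.22.0</version> <!-- placeholder version -->
  <configuration>
    <appName>${functionAppName}</appName> <!-- drives the folder and ZIP name -->
    <resourceGroup>my-resource-group</resourceGroup>
    <region>westeurope</region>
  </configuration>
</plugin>
```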
Yes, I indeed manually prepare the package (using the Publish Build Artifacts task).
I would like to share my steps to deploy a Java function package to an Azure Function.
Here are my steps:
In Build Pipeline:
steps:
- task: Maven@3
  displayName: 'Maven pom.xml'
  inputs:
    mavenPomFile: '$(Parameters.mavenPOMFile)'
    options: 'azure-functions:package'
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
  inputs:
    SourceFolder: '$(system.defaultworkingdirectory)'
    Contents: '**/azure-functions/**'
    TargetFolder: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()
- task: ArchiveFiles@2
  displayName: 'Archive $(Build.ArtifactStagingDirectory)/target/azure-functions/kishazureappfunction'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)/target/azure-functions/kishazureappfunction'
    includeRootFolder: false
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()
In Release Pipeline:
I use the Azure App Service Deploy task. (For clarity, I converted it to YAML format.)
- task: AzureRmWebAppDeployment@4
  displayName: 'Azure App Service Deploy: kevin1014'
  inputs:
    azureSubscription: kevintest
    appType: functionApp
    WebAppName: kevin1014
    packageForLinux: '$(System.DefaultWorkingDirectory)/_123-Maven-CI/drop/1.zip'
    enableCustomDeployment: true
    DeploymentType: runFromZip
Result:
The Azure Function name and the package name are different, but the package could still be deployed to the Azure Function successfully.
I'm currently building a Java Lambda function as a Gradle project. The application is built using the command gradlew build, which both tests and builds the application JAR.
The definition for my function resembles the following snippet from the SAM template.yml.
ImageResizeLambda:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: thumbor
    Handler: com.nosto.imagevector.resizer.ResizeController::handleRequest
    Runtime: java11
    Events:
      ResizeImage:
        Type: Api
        Properties:
          Path: /quick/{accountId}/{thumbnailVersion}/{imageId}/{hash}/
          Method: GET
    Policies:
      S3ReadPolicy:
        BucketName: !Ref ImageBucket
  Metadata:
    BuildMethod: makefile
As shown here, the build method is defined as makefile. The Makefile inside my Gradle project (which seems really unnecessary) has this bit:
CUR_DIR := $(abspath $(patsubst %/,%,$(dir $(abspath $(lastword $(MAKEFILE_LIST))))))

build-ImageNormalizeLambda:
	cd $(CUR_DIR)/.. && ./gradlew normaliser:clean normaliser:buildZip
	cd $(CUR_DIR) && unzip build/distributions/normaliser.zip -d $(ARTIFACTS_DIR)
Because my repository contains multiple projects, there's a Gradle subproject for every function.
While this seemingly works, I've glued it together myself, and it adds yet another DSL to an already-complex build.
Based on this project, I can see some work on a Gradle builder, but I'm unsure how/if this transitive dependency is used: https://github.com/aws/aws-lambda-builders/tree/develop/aws_lambda_builders/workflows/java_gradle
Is there a better way to build a Gradle-based Java Lambda function as compared to this approach? The documentation for sam build doesn't go very deep.
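For reference, the normaliser:buildZip task invoked by the Makefile above is not a built-in Gradle task. A sketch of how such a task is typically defined for Java Lambda packaging (the task name matches the Makefile; the paths are assumptions):

```groovy
// Sketch: a Zip task producing the layout AWS Lambda expects for Java
// (compiled classes and resources at the root, dependencies under lib/).
task buildZip(type: Zip) {
    archiveFileName = 'normaliser.zip'              // matches the unzip step in the Makefile
    destinationDirectory = file('build/distributions')
    from compileJava
    from processResources
    into('lib') {
        from configurations.runtimeClasspath        // bundle runtime dependencies
    }
}
```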
I would like to make a build pipeline in Azure DevOps including tests/code coverage.
For that, I created a very basic Java project:
package main:
- main class
- Calculator class
- add method
package test:
- CalculatorTest class
- addTest method
It's very basic, just for me to understand how tests in a pipeline work. I don't use Maven or anything like that. For the tests, I'm using the JUnit framework.
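The project described above can be sketched as follows (class and method names follow the description; plain assertions stand in for JUnit so the snippet is self-contained):

```java
// Sketch of the question's Calculator plus a test in the spirit of
// CalculatorTest#addTest, using a plain check instead of JUnit's
// assertEquals so it runs without any test dependency.
class Calculator {
    int add(int a, int b) {
        return a + b;
    }
}

public class CalculatorTest {
    public static void main(String[] args) {
        Calculator calc = new Calculator();
        // JUnit equivalent: assertEquals(5, calc.add(2, 3));
        if (calc.add(2, 3) != 5) throw new AssertionError("add failed");
        System.out.println("all tests passed");
    }
}
```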
In the Azure DevOps pipeline, I imported my project from GitHub and started to create the pipeline. I started from the starter template, which contains:
trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
My question is:
What do I have to do to run my tests automatically?
I've seen several examples in the Microsoft documentation, but they were always for "complex" projects (with Maven, etc.). As I'm new to Azure DevOps and the YAML file/syntax, I'm lost.
I want to run my tests after each commit and see the results (tests + code coverage) in the pipeline summary, as described here: https://learn.microsoft.com/en-us/azure/devops/pipelines/test/review-continuous-test-results-after-build?view=azure-devops#view-test-results-in-build
Thanks a lot.
PS: For the moment I'm just focusing on tests, but once that's done I would also like to publish build artifacts. I would like confirmation of this:
- task: PublishBuildArtifacts@1
Is that line correct?
EDIT
The line - task: PublishBuildArtifacts@1 seems to work correctly, but I get the following warning:
Directory '/home/vsts/work/1/a' is empty. Nothing will be added to build artifact 'drop'.
What does it mean?
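The warning means the artifact staging directory ($(Build.ArtifactStagingDirectory), which resolves to /home/vsts/work/1/a) was never populated: PublishBuildArtifacts only uploads what is already there. A CopyFiles step before it typically fixes this; a sketch (the Contents pattern is an assumption for a plain javac build):

```yaml
steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    Contents: '**/*.class'            # adjust to whatever your build produces
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```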
Finally, I used the visual designer (as explained here: https://learn.microsoft.com/en-US/azure/iot-edge/how-to-ci-cd) and added the Maven task.
I upgraded my project to use Maven, which is well integrated with Azure DevOps.
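In YAML form, a Maven task configured the way this answer describes in the designer looks roughly like this (a sketch; the pom.xml path and the JaCoCo option are assumptions, not taken from the answer):

```yaml
steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'
    publishJUnitResults: true          # surfaces Surefire test results in the run summary
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    codeCoverageToolOption: 'JaCoCo'   # collects and publishes code coverage
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```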
First of all some background:
I'm currently refactoring our application for internal tools and scripts, and ran into what some would call 'beauty problems'.
The base structure of the application should not include the scripts being created; those will be downloaded by the application on request. Think of it as an app store for internal tools.
But to give developers in the new repository the option to create those scripts with auto-completion, I'm looking for a way to include an external folder in IntelliJ's indexed files without adding the whole folder to the compile task.
The folder structure looks like this:
(1) ./ScriptSuite/scripts/
(2) ./src/main/
Where (2) holds all the backend sources, including utilities, database connections, etc., and (1) holds Groovy script files that can be loaded dynamically on request.
I want to avoid having the scripts in (2) because this led to confusion earlier: sometimes we couldn't tell whether a script had been compiled at app start or downloaded from our server after app start.
I tried adding (1) to the build.gradle via:
sourceSets {
    main {
        groovy {
            srcDirs {
                'src/main/groovy'
            }
            srcDir {
                'ScriptSuite/scripts'
            }
        }
    }
}
But then the files would be included in the compilation at app start, which I'm trying to avoid, since those should be compiled at runtime by the GroovyScriptEngine. Also, (1) is excluded from pushes to the repository; we host those files in our S3 bucket, and other tools are in place for version control.
What did work was adding (1) as a module source folder in IntelliJ itself, but this is only client-side and won't get pushed to the repository, so everyone would have to configure it for themselves (not good).
Any idea how to solve this problem with Gradle? I appreciate any help or tips!
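One Gradle-side option (a sketch using the standard idea plugin, not something from the question) is to register (1) as an IDE source folder without adding it to any compiled source set, so the configuration lives in build.gradle and is shared through the repository:

```groovy
// Sketch: make IntelliJ index ScriptSuite/scripts for completion and
// navigation without it ever being part of a sourceSet, so compileGroovy
// never sees these files.
apply plugin: 'idea'

idea {
    module {
        sourceDirs += file('ScriptSuite/scripts')
    }
}
```

Note this takes effect when the project is imported from Gradle (or generated via the gradlew idea task); a manually configured IntelliJ project won't pick it up.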
I followed the documentation for using app.yaml with Java, which claims that this should work and that web.xml and appengine-web.xml will be generated automatically. However, it doesn't seem to work, and the documentation doesn't mention which tool generates the files.
I first tried mvn clean install, which errors out because the .xml files are missing:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-war-plugin:2.4:war (default-war) on project roger-analytics: Error assembling WAR: webxml attribute is required (or pre-existing WEB-INF/web.xml if executing in update mode) -> [Help 1]
I then tried to run the local development server:
$ gcloud preview app run app.yaml
ERROR: (gcloud.preview.app.run) An error occurred while parsing file: [/Users/blixt/src/roger-api/module_analytics/app.yaml]
Unexpected attribute 'servlet' for object of type URLMap.
in "/Users/blixt/src/roger-api/module_analytics/app.yaml", line 7, column 12
(I get the same error from dev_appserver.py . by the way)
It appears that app.yaml isn't supported after all. Am I missing something, or was support removed without updating the documentation?
Here's my app.yaml file, which is intended to run as a module in my Google Cloud App Engine project (along with other modules that have Python and Go runtimes):
module: analytics
runtime: java
api_version: 1

handlers:
- url: /*
  servlet: im.rgr.roger.RogerAnalytics
  login: admin
  secure: always

system_properties:
  java.util.logging.config.file: WEB-INF/logging.properties
There are several issues at play here. I'll describe the various facts that combine into a constellation of SDK edge-case goodness (this information is current as of SDK 1.9.21):
- In order to deploy using the Java SDK's appcfg.sh, you need to have app.yaml inside the war/WEB-INF/ folder.
- appcfg.py complains: Unexpected attribute 'servlet' for object of type URLMap.
- gcloud preview app deploy uses appcfg.py (or the same codebase) and therefore complains in the same way.
So, in conclusion, you'll need to use appcfg.sh.
I can't run a vertx module for an Eclipse project on Windows 7.
I have followed the instructions here: http://vertx.io/gradle_dev.html
1. Download the template: https://github.com/vert-x/vertx-gradle-template
2. Run the tests:
   cd vertx-gradle-template-master
   gradlew.bat test
   BUILD SUCCESSFUL
3. Set up the IDE:
   gradlew.bat eclipse
   BUILD SUCCESSFUL
4. Try to run the module:
   gradlew.bat runMod
I got this:
:collectDeps UP-TO-DATE
:runMod
Module directory build\mods\com.mycompany~my-module~1.0.0-final already exists. Creating properties on demand (a.k.a. dynamic properties) has been deprecated and is scheduled to be removed in Gradle 2.0. Please read http://gradle.org/docs/current/dsl/org.gradle.api.plugins.ExtraPropertiesExtension.html for information
on the replacement for dynamic properties.
Deprecated dynamic property: "args" on "task ':runMod'", value: "[runmod, com.mycompany...".
Building 50% > :runMod
What should I do with this? I don't understand.
Actually, you got everything working!
The dev guide [0] provides the extra information you need. According to it, Gradle swallows INFO-level messages from the program. If you run it again with '-i', you will see the missing output, which actually indicates that things are working as expected.
Here's what I see when I run ./gradlew runmod -i (note that case doesn't matter; this works the same as 'runMod'), where you can see the log message from the PingVerticle class indicating it's waiting for ping messages:
...Same output as from the question...
PingVerticle started
Succeeded in deploying module
> Building 50% > :runMod
Unfortunately, the documentation is sorely lacking and the PingVerticle will just sit there indefinitely until you actually send a ping message yourself.
[0] http://vertx.io/dev_guide.html