
Posts

Get Free Voucher for Azure Certification

If you want to measure and prove your Microsoft Azure cloud skills free of cost, this blog is for you. How to get an Azure free voucher code: Microsoft is giving you the opportunity to prove your Azure skills free of cost. Just follow the steps below to get your unique free voucher, which you can then use to schedule a certification exam at no charge. Please note that this is a limited-time offer and the voucher is valid for 3 months. Go to the Microsoft Azure events website: click here. A list of events will be displayed; among them, click on 'Microsoft Azure Virtual Training Day: Fundamentals'. Slots may be full on a particular day, so you may need to go through all the dates to find an available one. Once you find an available slot, register. Attend the virtual event in your selected slot, and you will receive a voucher code by email within 7 days. You can use this code to schedule the exam free of cost. My Exam Experience: I have cleared AZ-204: Developing Solutions for Microsoft

Secure Azure Function App with Azure Active Directory (AD). [Token based access]

Welcome to BigDataStacks. This blog covers how to secure an Azure Function app with Azure Active Directory, so that accessing the function app prompts for a login. I also elaborate on how to access the function URL with an access token. Let's start. Configure the Function App: Create an Azure Function app with anonymous access. Go to the function app's 'Authentication / Authorization' section under 'Platform features'. Turn on App Service Authentication/Authorization. Select the action 'Login with Azure AD'. Click on Azure AD under Auth providers. Select 'Express' and 'Create a new AD app', then click OK. Click 'Save'. Reopen the screen where we selected 'Express' mode and now select 'Advanced'. Copy the 'clientId', which will be used later. NOTE: If the clientId is not showing, refresh the page and it will display. Add one more entry in 'Allowed Token Audi
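Once the function app is protected, a client can call it with a bearer token obtained via the Azure AD client-credentials flow. The sketch below only builds the token request and the Authorization header; the tenant, client ID, and secret are placeholders, and using the copied clientId as the `resource` (token audience) is an assumption based on the setup above.

```python
# Sketch: calling an AAD-protected Function App with a bearer token.
# "my-tenant", "my-client-id", and "secret" are placeholders, not real values.
from urllib.parse import urlencode

def token_request(tenant_id, client_id, client_secret, resource):
    """Build the client-credentials token request for the Azure AD v1 endpoint."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # assumed: the clientId copied above, used as the audience
    }
    return url, urlencode(body)

def auth_header(access_token):
    """Header to attach when calling the secured function URL."""
    return {"Authorization": f"Bearer {access_token}"}

url, body = token_request("my-tenant", "my-client-id", "secret", "my-client-id")
# POST `body` to `url` (e.g. with urllib.request); the "access_token" field of the
# JSON response then goes into auth_header(...) for the call to the function URL.
```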

AWS IOT Thing Job

AWS IoT thing job creation, new-job notification, starting the job, and updating it after downloading firmware through the Java SDK, with a JavaFX UI | presigned S3 URL creation. This application is made for firmware download. Refer to this Git repository: AWS IOT POC. The repository contains 3 projects: Aws-Pre-Signed-Url-Generation: generates a presigned URL for use in the job document. NOTE: the AWS CLI should be configured. Iot-Create-Thing-Job-App: creates an IoT thing job, with a UI. NOTE: the access key and secret key should be set in aws-iot-sdk-samples.properties. Iot-Start-Update-Thing-Job-App: receives the notification for a new job, starts the job, and fetches the job document from AWS. After getting the thing job document, it downloads the firmware zip from the URL in the document and updates the job status to SUCCEEDED or FAILED. NOTE: the properties in aws-iot-sdk-samples.properties should be set as per your AWS account. Job document: sample-job-document.json { "ope
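The job document is just a small JSON file the device receives when it starts the job. The sample above is truncated, so the field names below (`operation`, `firmwareUrl`, `version`) are assumptions for illustration — substitute whatever sample-job-document.json actually uses.

```python
import json

def make_job_document(operation, firmware_url, version):
    """Assumed shape of a firmware-update job document: an operation name plus
    the presigned S3 URL the device downloads the firmware zip from.
    Field names are hypothetical, not taken from the repository."""
    return json.dumps({
        "operation": operation,
        "firmwareUrl": firmware_url,  # presigned URL from Aws-Pre-Signed-Url-Generation
        "version": version,
    }, indent=2)

doc = make_job_document("firmware-update",
                        "https://s3.amazonaws.com/bucket/fw.zip?X-Amz-...",
                        "1.0.0")
parsed = json.loads(doc)  # what the device sees after fetching the job document
```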

AWS IOT JITR (Just in Time registration) with Thing and Policy creation using JAVA

AWS IoT JITR with thing and policy creation using Java. This POC provides Just-In-Time Registration (JITR) of a custom certificate plus thing creation with a connect policy for AWS IoT devices. You just need to put the thing name in the common name while creating the device certificate, and the thing will be created with the policy and certificate attached, using the common name as the thing name. Project overview: get the certificate details from the certificate ID; parse the certificate details and extract the common name; create an IoT policy with a connect action; create an IoT thing named after the certificate's common name; attach the policy and thing to the certificate; activate the certificate. Now your device can connect to AWS using this custom certificate. Steps for JITR & thing creation. Create the CA certificate:
openssl genrsa -out CACertificate.key 2048
openssl req -x509 -new -nodes -key CACertificate.key -sha256 -days 365 -out CACertificate.pem
Enter the necessary details like city, country, etc.
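The two pure-logic steps of the overview — pulling the common name out of the certificate subject and building the connect-only policy — can be sketched as below. The subject string and thing name are hypothetical examples; the actual POC does this in Java against the AWS SDK.

```python
import json

def common_name(subject):
    """Extract CN from an X.509 subject string such as
    'CN=my-thing,O=Example,C=IN' (values here are hypothetical)."""
    for part in subject.split(","):
        key, _, value = part.strip().partition("=")
        if key == "CN":
            return value
    return None

def connect_policy():
    """Minimal IoT policy document allowing only iot:Connect, as in the POC."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "iot:Connect",
            "Resource": "*",
        }],
    })

name = common_name("CN=my-thing,O=Example,C=IN")  # becomes the thing name
policy = connect_policy()                          # attached to the certificate
```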

AWS Kinesis - Stream, Firehose, Analytics Overview

AWS Kinesis: AWS Kinesis is a managed alternative to Apache Kafka. It can be used for big-data real-time stream processing — application logs, metrics, forecast data, IoT — and it works with stream-processing frameworks like Spark, NiFi, etc. Kinesis capabilities: Kinesis Streams: streaming data ingest at scale with low latency; it is a data stream. Kinesis Analytics: analytics on real-time streaming data using SQL; you can filter or aggregate data in real time. Kinesis Firehose: loads streams of data into S3, Redshift, Splunk, or Elasticsearch; it is a delivery stream. Kinesis Data Streams: streams are divided into shards. To scale the application up, update the shard configuration by increasing the number of shards. By default a shard's data is retained for 1 day, but this can be extended to 7 days. Multiple applications can use the same stream, with real-time processing of data at scale. Record size should not
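Since scaling a stream means increasing the shard count, a quick capacity calculation helps: each shard accepts up to 1 MB/s or 1,000 records/s of ingest (the documented per-shard write limits), and whichever dimension you hit first determines how many shards you need.

```python
import math

# Documented per-shard write limits for Kinesis Data Streams (ingest side).
SHARD_MB_PER_S = 1.0
SHARD_RECORDS_PER_S = 1000

def shards_needed(mb_per_s, records_per_s):
    """Shards required to absorb a given ingest rate; scale the stream
    up by updating its shard count to at least this number."""
    return max(math.ceil(mb_per_s / SHARD_MB_PER_S),
               math.ceil(records_per_s / SHARD_RECORDS_PER_S))

shards_needed(5.0, 12_000)  # -> 12: record rate, not bytes, is the bottleneck here
```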

Success Story on CCA 175 Exam - Jan 2019

Hello Spark developers, I passed this exam on 3rd Jan 2019, and I would like to share my experience and some exam-related advice. How to register and start the exam: First of all, to register for this exam you need an account on both the Cloudera site and examslocal.com. You pay $295 on the Cloudera site and then schedule your exam on examslocal.com. You can reschedule an exam up to 24 hours beforehand. Once you schedule an exam on the examslocal site, you will see how much time is left until it starts. Before taking the exam, please make sure you have installed Chrome with the necessary Chrome extension; you can check your system by running the compatibility test on the examslocal site. You can start your exam as early as 15 minutes before, but no later than 15 minutes after, the scheduled time; the Start Exam button appears 15 minutes before your scheduled exam. Information related to the exam's questions and result: As this exam

CCA 175 Preparation with these 6 practice questions - try to solve them in 1 hour

Since I found very few practice questions on the internet for CCA 175, the Hadoop & Spark developer exam, I have set a 6-question exam, with the solutions provided in the comment section. If you complete it in less than 1 hour, then consider applying for the CCA 175 exam; otherwise you need more practice. Question prerequisites: import data from the orders table in Parquet file format and save it to the HDFS path /user/rj_example/parquet_data/orders; import data from the customers table and save it to /user/rj_example/data/customers; import data from the customers table in Avro file format and save it to /user/rj_example/avro_data/customers; import data from the categories table and save it to /user/rj_example/data/categories; import data from the products table and save it to /user/rj_example/data/products with '\t' as the field separator; create a local dir 'rj_example'; copy data from /user/rj_example/data/products to the local dir
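The prerequisite imports above are typically done with sqoop. As a sketch, the helper below assembles sqoop import commands for two of them; the JDBC connection string, username, and password are placeholders for your own cluster, not values from the exam.

```python
def sqoop_import(table, target_dir, extra=()):
    """Assemble a sqoop import command line for the prerequisites above.
    Connection string and credentials are hypothetical placeholders."""
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:mysql://localhost/retail_db",
        "--username", "user", "--password", "pass",
        "--table", table,
        "--target-dir", target_dir,
    ]
    cmd += list(extra)
    return cmd

# Parquet import of orders (first prerequisite):
cmd = sqoop_import("orders", "/user/rj_example/parquet_data/orders",
                   ["--as-parquetfile"])
# Tab-separated import of products (fifth prerequisite):
cmd2 = sqoop_import("products", "/user/rj_example/data/products",
                    ["--fields-terminated-by", "\t"])
```

Each list can be handed to a shell runner or joined into a command string; timing yourself while writing these by hand is good practice for the real exam terminal.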