
Success Story on CCA 175 Exam - Jan 2019

Hello, Spark developers! I passed this exam on 3rd Jan 2019, and I would like to share my experience and some exam-related advice.

How to register and start the Exam

  • First of all, to register for this exam you need an account on both the Cloudera site and examslocal.com. You pay $295 on the Cloudera site, and then you can schedule your exam on examslocal.com. You can reschedule an exam up to 24 hours before it starts.
  • Once you schedule an exam on the examslocal site, you will see how much time is left until the scheduled exam.
  • Before taking the exam, please make sure you have installed Chrome with the necessary Chrome extension. You can test your system compatibility by pressing Compatibility Test on the examslocal site.
  • You can start your exam as early as 15 minutes before the scheduled time, but no later than 15 minutes after it. You will see the Start Exam button 15 minutes before your scheduled exam.
Information related to Exam Questions and Results
  • This exam does not have MCQ-type questions; instead, you have to write programs related to Spark, Hive, Sqoop, and Flume and save the output to the specified directory.
  • The questions are not too hard, but they can be tricky and might confuse you.
  • This exam does not grade your logic or solution approach; it is output-oriented, i.e., your output must match the expected output. If your output is wrong, the score-result email will give the reason, such as a different file type or a different number of records.
  • You will get the result email within 30 to 40 minutes of completing the exam.
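To make the "output-oriented" point concrete, a typical task is: read a dataset, transform it, and write the result in an exact format to an exact directory. The following is a minimal sketch in spark-shell (Scala) of such a task; the paths, column layout, and options here are hypothetical, not from an actual exam question, but matching exactly these kinds of details is what gets graded.

```scala
// Hypothetical task: read tab-delimited order data, keep only COMPLETE
// orders, and save the result as gzip-compressed comma-separated text.
// (The paths and field names below are assumptions for illustration.)
val orders = spark.read.
  option("sep", "\t").
  csv("/user/cert/problem1/orders").
  toDF("order_id", "order_date", "customer_id", "status")

val complete = orders.filter($"status" === "COMPLETE")

// The output directory, delimiter, and compression must match the
// question exactly -- a wrong file type or record count fails the task.
complete.write.
  option("sep", ",").
  option("compression", "gzip").
  csv("/user/cert/problem1/solution")
```

Before moving to the next question, it is worth spending a minute verifying the output with `hdfs dfs -ls` and `hdfs dfs -cat` to confirm the file format and record count look right.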
Exam Environment
  • This exam is not held at any exam centres.
  • You can take the exam from a laptop or any PC with a webcam, but make sure to check your system's compatibility using the compatibility test. I used a desktop PC with an external webcam.
  • There is no on-screen timer showing how much time is left, but you can ask the proctor for the remaining time via the live chat option.
  • You have to click the End Exam button in the header after completing the exam.
  • You have to allow webcam access and screen sharing from the website header.
Exam Preparation Advice

I recommend completing the following practice questions before attempting the actual exam.
My second piece of advice is to work through these practice question blogs repeatedly, as much as possible.
Practice... practice... and practice, and you will surely clear this exam.

All the best for the exam.

Thanks & Best Regards.
Rajan Patel
