Summer Internship Report
Submitted to: Prof. (Dr.) Deepak Kumar
Submitted by: Himanshu Sharma
A1004819195
Batch 2022
DECLARATION
I, Himanshu Sharma, student of BCA, hereby declare that the Summer Internship report titled Software Engineering, which is submitted by me to the Amity Institute of Information Technology, Amity University Uttar Pradesh, Noida, in partial fulfillment of the requirements for the award of the degree, has not previously formed the basis for the award of any degree, diploma or other similar title or recognition.
The author attests that permission has been obtained for the use of any copyrighted material appearing in the Summer Internship report, other than brief excerpts requiring only proper acknowledgement in scholarly writing, and that all such use is acknowledged.
I would like to express my deep gratitude to Mr. Anand Verma and Ms. Anubha Goyal, my training coordinators, for their constant co-operation. They were always there with their competent guidance and valuable suggestions throughout the pursuance of this project.
TABLE 1.1
Type: Public
J.P. Morgan provides asset management, investment banking, treasury and securities services, private banking, and commercial banking services to some of the world's largest clients. J.P. Morgan is headquartered in New York, and has international offices in London, Tokyo, Hong Kong, Singapore, Sao Paolo, and Mumbai, among others.
OVERVIEW
J.P. Morgan is part of New York-based JPMorgan Chase & Co., a financial services firm with around 257,000 employees. JPMorgan Chase also provides consumers and small businesses with a range of financial services and products. J.P. Morgan's celebrated history, which includes several mergers, dates to 1799, when the New York State Legislature chartered The Manhattan Company to supply "pure and wholesome" water to the residents of New York City. J.P. Morgan also has European roots. When J. Pierpont Morgan set up J.P. Morgan & Co. in New York in 1871, the bank initially served as a New York sales and distribution office for his father's firm, J.S. Morgan & Co., an underwriter of European securities.
In 2000, the Chase Manhattan Bank merged with J.P. Morgan & Co. In 2004, Bank One and JPMorgan Chase merged, with the CEO of Bank One, Jamie Dimon, taking over the reins of the combined organization. In 2008, JPMorgan Chase famously acquired its ailing rival Bear Stearns. Although the firm weathered the financial crisis better than most of its peers, a 2012 trading loss in excess of $5 billion tarnished the organization's otherwise clean record.
Today, the firm is the top underwriter of worldwide debt and equity underwriting volume on Wall Street, and one of the leading investment banks in terms of worldwide M&A deal volume.
Card Services
The organization has 94 million cards in circulation and $135 billion in managed loans. It is the second-largest issuer of credit cards in the U.S. and the largest merchant acquirer.
Commercial Banking
JPMorgan Chase has 30,000 middle-market clients, 1,700 corporate banking clients and 1,100 commercial real estate banking clients.
Awards
Best Banking Performer, United States of America in 2016 by Global Brands Magazine Award
TASKS GIVEN
Software Engineering Task: code changes
TASK 1
-Interface with a stock price data feed and create tests for it
Essential
Setup should have been completed. This means your server and client applications should be running without any issues, without introducing any changes to the code yet. You can verify this if you get a result similar to any of the following, which include an image of the server and client application running together
OBJECTIVES
● If you closely review the output of the client applications in the screenshots, there are two wrong things …
○ (1) Ratio is always 1
○ (2) The price of each stock is always the same as its bid_price.
● These are obviously wrong; the job is to fix those things…
● You’ll be making changes to the code in some of the files within the repository you cloned or downloaded to achieve the OBJECTIVES of the task.
● To do this, you can use any text editor your machine has and simply open the files in the repository that need to be changed, e.g.
Sublime Text
VSCode
As these are the most commonly used code editors
Making changes in the main method of ‘client.py’ (‘client3.py’ for Python 3)
● To fix the issues in main (whether in Python 2 or Python 3), what we did was create a prices dictionary to store the stock prices. Think of a dictionary as a key-value store in which you can specify a key and retrieve a value. The key was the stock name and the value was the price.
● We then used this prices dictionary at the end to pass the right values into the getRatio function.
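The changes described above can be sketched in Python as follows. This is a minimal sketch, not the repository's exact code: the quote structure with `top_bid`/`top_ask` follows the starter code's feed format, and the `process_quotes` helper is an illustrative stand-in for the loop in the main method.

```python
def getDataPoint(quote):
    """Extract stock name, bid, ask, and the corrected price from one quote.

    The bug was that price was set to bid_price; the fix is to use the
    mid-price, i.e. the average of the bid and the ask.
    """
    stock = quote['stock']
    bid_price = float(quote['top_bid']['price'])
    ask_price = float(quote['top_ask']['price'])
    price = (bid_price + ask_price) / 2  # fix: no longer just bid_price
    return stock, bid_price, ask_price, price


def getRatio(price_a, price_b):
    """Return the ratio of two stock prices (the bug made this always 1)."""
    if price_b == 0:
        return None  # avoid division by zero
    return price_a / price_b


def process_quotes(quotes):
    """Mirror of the main-method change: store each stock's price in a
    dictionary keyed by stock name, then pass the right values to getRatio."""
    prices = {}
    for quote in quotes:
        stock, bid_price, ask_price, price = getDataPoint(quote)
        prices[stock] = price
    return getRatio(prices['ABC'], prices['DEF'])
```

With this in place the printed ratio varies with the two mid-prices instead of always being 1.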
TASK 2

Essential
Setup should have been completed. This means your server and client applications should be running without any issues, without introducing any changes to the code yet. You can verify this if you get a result similar to any of the following, which include an image of the server and client application running together
Notice Initial State of Client App in Browser
This is how the initial state of the client application looks when you click the blue “Start Streaming Data” button a number of times
If you clicked on the 3-dotted button on the upper left corner of the graph, you’ll see something like Image (a) on this page. This tells you that the graph is configurable
Image (b) further shows you the different types of views you can use to visualize the data you have so far
If you look back at the data again, you’ll also observe it’s just a bunch of duplicate data being printed for stocks ABC and DEF, until there is newer data, i.e. a different timestamp, ask_price and bid_price for the ABC and DEF stocks. But the printing of duplicated data doesn’t seem useful at all...
OBJECTIVES
● There are two things we need to accomplish here to finish this task
○ (1) Make the graph update continuously instead of having to click the button a bunch of times. The kind of graph we want to serve as the visual here is a continuously updating line graph whose y-axis is the stock’s top_ask_price and whose x-axis is the timestamp of the stock
○ (2) Remove / disregard the duplicated data we saw earlier…
● The kind of graph we want to end up with is something like this:
● To accomplish this we need to change two (2) files: src/App.tsx and src/Graph.tsx
● You can use any text editor your machine has and simply open the files in the repository that need to be changed
● Next, in the constructor of the App component, you should define that the initial state of the App does not show the graph yet. This is because we want the graph to show only when the user clicks ‘Start Streaming Data’. That means you should set the `showGraph` property of the App’s state to `false` in the constructor
● To ensure that the graph doesn’t render until a user clicks the ‘Start Streaming Data’ button, you should also edit the `renderGraph` method of the App. In there, you must add a condition to only render the graph when the `showGraph` property of the App’s state is `true`.
● Finally, you must also modify the `getDataFromServer` method to contact the server and get data from it continuously, instead of just getting data from it once every time you click the button.
● JavaScript has a way to do things in intervals, and that is via the setInterval function. What we can do to make it continuous (at least for an extended period of time) is to have a guard value that we can check against to decide when to stop / clear the interval process we started.
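The actual change uses JavaScript's setInterval/clearInterval, but the guard-value idea itself is language-agnostic. Here is a small Python sketch of that idea only; the function and variable names are illustrative, not from the repository.

```python
def poll_continuously(fetch, max_polls=1000):
    """Call fetch() repeatedly, like a setInterval callback firing.

    The counter acts as the guard value: once it reaches max_polls the
    loop stops, playing the role that clearInterval plays in the JS code.
    """
    results = []
    polls = 0
    while polls < max_polls:  # guard check: when to stop the "interval"
        results.append(fetch())
        polls += 1
    return results
```

In the JS version, each tick likewise checks the guard and calls clearInterval once the limit is reached.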
● As you may notice in the image, it’s in the same method, getDataFromServer, that we set showGraph to true as soon as the data from the server comes back to the requestor.
● The line DataStreamer.getData(... => ...) is an asynchronous process that gets the data from the server, and when that process is finished, it then performs what comes after the => as a callback function
● Changes in App.tsx end here.
● By now you should’ve accomplished modifying the client to request data from
server continuously
● By now you should’ve also accomplished setting the initial state of the graph not to
show until the user clicks the “Start Streaming Data” button
● Finally, you need to add more attributes to the element. For this you have to have read through the Perspective configuration documentation, particularly the table.view configurations. You’ll need to add the following attributes: `view`, `column-pivots`, `row-pivots`, `columns` and `aggregates`. If you remember the observations we made earlier, this is the configurable part of the table/graph. The end result should look something like:
● ‘view’ is the kind of graph we wanted to visualize the data as. Initially, if you remember, this was the grid type. However, since we wanted a continuous line graph to be the final outcome, the closest one would be y_line
● ‘column-pivots’ is what will allow us to distinguish stock ABC from DEF. Hence we use ‘[“stock”]’ as its corresponding value here. By the way, we can use stock here because it’s also defined in the schema object. The same accessibility goes for the rest of the attributes we’ll discuss.
● ‘row-pivots’ takes care of our x-axis. This allows us to map each datapoint based on
the timestamp it has. Without this, the x-axis is blank.
● ‘columns’ is what will allow us to only focus on a particular part of a stock’s data along the y-axis. Without this, the graph would plot all the different fields of a stock, i.e. top_ask_price, top_bid_price, stock and timestamp. For this instance we only care about top_ask_price
● ‘aggregates’ is what will allow us to handle the duplicated data we observed earlier and consolidate them as just one data point. In our case we only want to consider a data point unique if it has a unique stock name and timestamp. Otherwise, if there are duplicates like what we had before, we will average out the top_bid_prices and top_ask_prices of these ‘similar’ datapoints before treating them as one.
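Perspective performs this consolidation internally once `aggregates` is configured, but the averaging behavior can be sketched in Python roughly as follows. The field names match the schema discussed above; the `consolidate` helper itself is illustrative, not part of the repository.

```python
def consolidate(points):
    """Merge duplicate data points that share (stock, timestamp),
    averaging their top_bid_price and top_ask_price values."""
    groups = {}
    for p in points:
        key = (p['stock'], p['timestamp'])  # uniqueness criterion
        groups.setdefault(key, []).append(p)

    merged = []
    for (stock, timestamp), dups in groups.items():
        merged.append({
            'stock': stock,
            'timestamp': timestamp,
            'top_bid_price': sum(d['top_bid_price'] for d in dups) / len(dups),
            'top_ask_price': sum(d['top_ask_price'] for d in dups) / len(dups),
        })
    return merged
```

Two rows for the same stock and timestamp thus collapse into one averaged row, which is exactly what stops the duplicated data from cluttering the graph.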
● Changes in Graph.tsx are done too
END RESULT
TASK 3
-Display data visually for traders
Essential
● Setup should have been completed. This means your server and client applications should be running without any issues, without introducing any changes to the code yet. You can verify this if you get a result similar to any of the following, which include an image of the server and client application running together
Notice Initial State of Client App in Browser
This is how the initial state of the client application looks when you click the blue “Start Streaming Data” button. It’s pretty much like the state of task 2.
You have two stocks displayed and their top_ask_price changes being tracked through a timeline
If you clicked on the 3-dotted button on the upper left corner of the graph, you’ll see something like the image on this slide.
This tells you that the graph is configurable.
OBJECTIVES
● There are two things we need to accomplish here to finish this task
○ (1) We now need to make this graph more useful to traders by making it track the ratio between the two stocks over time, and NOT the two stocks’ top_ask_price over time.
○ (2) As mentioned previously, traders want to monitor the ratio of two stocks against a historical correlation, with upper and lower thresholds/bounds. This can help them determine a trading opportunity. That said, we also need to make this graph plot those upper and lower bounds, and show when they get crossed by the ratio of the stocks
● In the end we want to achieve a graph that looks something like this
● ‘view’ is the kind of graph we wanted to visualize the data as. Initially, this is already set to y_line. This is the type of graph we want, so we’re good here.
● ‘column-pivots’ used to exist and was what allowed us to distinguish stock ABC from DEF back in task 2. We removed this because we’re now concerned with the ratio between the two stocks and not their separate prices
● ‘row-pivots’ takes care of our x-axis. This allows us to map each datapoint based on the timestamp it has. Without this, the x-axis is blank. So this field and its value remain
● ‘columns’ is what will allow us to only focus on a particular part of a datapoint’s data along the y-axis. Without this, the graph would plot all the fields and values of each datapoint and it would be a lot of noise. For this case, we want to track ratio, lower_bound, upper_bound and trigger_alert.
● ‘aggregates’ is what will allow us to handle the cases of duplicated data we observed back in task 2 and consolidate them as just one data point. In our case we only want to consider a data point unique if it has a unique timestamp. Otherwise, we will average out the values of the other non-unique fields of these ‘similar’ datapoints before treating them as one (e.g. ratio, price_abc, …)
● Finally, we have to make a slight update in the componentDidUpdate method. This method is another component lifecycle method that gets executed whenever the component updates, i.e. when the graph gets updated in our case. The change we want to make is to the argument we pass to this.table.update. This is how it’s supposed to look after the change:
There’s a reason why we made this change, and we’ll see it in the next change: the changes in DataManipulator.ts
● Observe how we’re able to access serverRespond as an array wherein the first element (0-index) is about stock ABC and the second element (1-index) is about stock DEF. With this, we were able to easily plug values into the formulas we used back in task 1 to compute the prices and ratio properly
● Also note how the return value has changed from an array of Row objects to just a single Row object. This change explains why we also adjusted the argument we passed to table.update in Graph.tsx earlier, so that consistency is preserved.
● The upper_bound and lower_bound are pretty much constant for any data point. This is how we will be able to maintain them as steady upper and lower lines in the graph. While 1.05 and 0.95 aren’t really +/-10% of the 12-month historical average ratio (i.e. 1.1 and 0.99), you’re free to play around with the values and see which has a more conservative alerting behavior.
● The trigger_alert field is basically a field that has a value (i.e. the ratio) if the threshold is crossed by the ratio. Otherwise, if the ratio remains within the threshold, no value/undefined will suffice.
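The actual row-building code lives in the TypeScript file DataManipulator.ts; its logic can be sketched in Python like this. This is a sketch under the assumptions above: serverRespond[0] is stock ABC, serverRespond[1] is stock DEF, and 1.05/0.95 are the constant bounds just discussed.

```python
def generate_row(server_respond, upper_bound=1.05, lower_bound=0.95):
    """Turn one server response (two stock quotes) into a single row
    carrying the ratio, the constant bounds, and the trigger_alert field."""
    # mid-price of each stock, as computed back in task 1
    price_abc = (server_respond[0]['top_ask']['price'] +
                 server_respond[0]['top_bid']['price']) / 2
    price_def = (server_respond[1]['top_ask']['price'] +
                 server_respond[1]['top_bid']['price']) / 2
    ratio = price_abc / price_def
    return {
        'price_abc': price_abc,
        'price_def': price_def,
        'ratio': ratio,
        'upper_bound': upper_bound,  # constant, so it plots as a steady line
        'lower_bound': lower_bound,  # constant, so it plots as a steady line
        # trigger_alert carries the ratio only when a bound is crossed,
        # and stays empty (None/undefined) otherwise
        'trigger_alert': ratio if (ratio > upper_bound or
                                   ratio < lower_bound) else None,
        'timestamp': max(server_respond[0]['timestamp'],
                         server_respond[1]['timestamp']),
    }
```

Returning one consolidated row per response is what the adjusted table.update argument in Graph.tsx expects.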
● Changes in Graph.tsx and DataManipulator.ts are done.
END RESULT
REFERENCES
https://www.jpmorganchase.com/
https://en.wikipedia.org/wiki/JPMorgan_Chase
https://stackoverflow.com/
SYNOPSIS
AMITY INSTITUTE OF INFORMATION & TECHNOLOGY
Summer Internship
WEEKLY PROGRESS REPORT (WPR)
For the week commencing on: 17/05/2021
Week’s Summary
Days/Time
Tuesday: Waited for responses, and even cleared some rounds of selection
Friday: -
Week’s Summary
Days/Time
Tuesday: -
Thursday: Got to know about the different technologies and languages to be used.
Friday: Learning more about REPL and PERSPECTIVE and how they work.
Week’s Summary
Days/ Time
Week’s Summary
Days/ Time
PLAGIARISM REPORT