Optimization

From UG

Revision as of 13:11, 7 May 2012 by Perry (Talk | contribs)



Info

  • parent mantis: 0000028: [Optimization] .... parent
  • mantis category: Optimization

Requirements

Keep system speed/performance at normal levels, both now and in the future.

It is assumed that to satisfy the above we have to:

  • Optimize the existing code, DB, etc.
  • Develop an optimization strategy for future development

[Perry: 2/22/2012 comments/ideas regarding approach and strategy]

  • Separate web/app server from database server.
  • Introduce scheduled batch reporting features for processor/memory intensive reports
  • Develop plan to proactively monitor server utilization (Evaluate available tools - I know BMC Software has tools to report on various data points across a specified period of time)
  • Look for MySQL and Tomcat tools to monitor SQL statements and long running processes. Evaluate database explain plans - optimize SQL statements and introduce indexes where necessary.
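A minimal sketch of the slow-statement monitoring idea above, assuming query texts and elapsed times have already been collected (e.g. from MySQL's slow query log or application timing). The class, method, and threshold are illustrative, not part of the CT2 code base:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/**
 * Sketch: pick out long-running SQL statements from collected timing
 * data so they can be examined with EXPLAIN and indexed if needed.
 */
class SlowQueryFilter {

    /** Return queries whose elapsed time exceeds thresholdMillis. */
    static List<String> slowQueries(Map<String, Long> timings, long thresholdMillis) {
        List<String> slow = new ArrayList<>();
        for (Map.Entry<String, Long> e : timings.entrySet()) {
            if (e.getValue() > thresholdMillis) {
                slow.add(e.getKey());    // candidate for EXPLAIN / indexing
            }
        }
        return slow;
    }
}
```

Each query this flags would then be run through MySQL's EXPLAIN to decide whether a rewrite or a new index is warranted.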

SOW 0

  • 0001357: (Misc) Create CT2 2015 DB (Ops/Acc modules). Test speed, optimize (requirements for how fast each CT2 table will grow are attached).
	For every table:
a) record how many records there are now
b) find out (interview the BAs) how many records there will be at the end of 2010
c) estimate the rate of growth per year after 2010
d) post this info to http://ct.jaguarfreight.com/wiki/Intro_into_CT2_DB

Based on the info above CT2 Architect will:

a) create a database similar in size to Dec 2010 levels
b) estimate the speed of the application
c) compare the above with the target
d) optimize the DB and algorithms to reach the target speed
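The per-table sizing exercise above can be sketched as a simple projection: given the estimated row count at the end of 2010 and a yearly growth rate thereafter, project the count for a later target year. The inputs and names here are illustrative assumptions to be gathered per table:

```java
/**
 * Sketch: project a table's row count for a future year from the
 * BA-estimated end-of-2010 count and a yearly growth rate after 2010.
 */
class TableGrowthEstimate {

    /** Projected rows at the end of targetYear (targetYear >= 2010). */
    static long projectedRows(long rowsEnd2010, double yearlyGrowthRate, int targetYear) {
        double rows = rowsEnd2010;
        for (int year = 2010; year < targetYear; year++) {
            rows *= (1.0 + yearlyGrowthRate);   // compound growth per year
        }
        return Math.round(rows);
    }
}
```

The architect would seed the test database with roughly these counts per table before measuring application speed against the target.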

SOW 1

  • 0002233: (CT2 Misc) Ongoing DB optimization (SOW 1)

Implementation Notes SOW 1

  • P&L for multiple CTs report refactoring
  • Billing Required report refactoring
  • P&L report for CT group and for Master: lines from one PI for different CTs were grouped into one line, per Marc's request

SOW 2

  • 0003312: (CT2 Misc) Ongoing DB optimization (SOW 2)

Implementation Notes SOW 2


1. Refactored the ChargeCodes-to-ShipmentInvoices relation table (direct access without going through the tblTableOfCharges table).
2. Reworked the ChargeCode persistence methods to speed up invoice access. This improved the speed of report generation.

The "P&L for multiple CTs" report ("Special View: Charge Code Group") was optimized and now runs 10-20 times faster.

SOW 3

Implementation Notes SOW 3

  • Refactor the IsBasicApproval field (Boolean) into a PlannerType field. This will simplify the query logic.
  • Migrate the current Planner Shipment Authorization Status fields into the tblGenericShipment table
    • Avoids joining the pair of long tables tblGenericShipment and tblShipmentAuthorizationStatusHistory
    • Improves performance of the Planner/Shipper portal and of reporting
  • Refactored the Query class
    • Removed quotation marks around Integer parameters in query composition. As Vlad noted, comparing Integer values to Character values has a significant negative effect on performance.
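The quotation-mark fix can be sketched as follows: quote string parameters but leave integer parameters unquoted, so MySQL compares numbers to numbers and can use an index instead of converting each row. `formatParam` is a hypothetical helper, not the actual Query class method; real code should prefer `PreparedStatement` placeholders:

```java
/**
 * Sketch: format a query parameter for SQL composition. Integers stay
 * unquoted (index-friendly); strings are quoted and naively escaped.
 */
class QueryParamFormatter {

    static String formatParam(Object value) {
        if (value instanceof Integer || value instanceof Long) {
            return value.toString();                 // e.g. 42, no quotes
        }
        // naive single-quote doubling, for the sketch only;
        // production code should use PreparedStatement placeholders
        return "'" + value.toString().replace("'", "''") + "'";
    }
}
```

With the integer left unquoted, a predicate like `WHERE shipmentId = 42` avoids the per-row string conversion that `WHERE shipmentId = '42'` can trigger.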


SOW 4

Kostya: I need to add indexes and constraints to the PDF docs tables, for individual CTs and for masters. To implement this I have to drop some "garbage" records from these tables, so I need time to review the existing records and remove all that do not link to shipments currently in the database. Only after this can I add the needed constraints to the tables. These indexes and constraints will improve database integrity and speed.
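The cleanup step described above can be sketched in memory: keep only the document rows whose shipment id still exists, and only then (in the real database) add the foreign-key constraint. Table and method names here are illustrative; the actual cleanup would be a DELETE against the PDF docs tables:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

/**
 * Sketch: drop orphaned document records (those pointing at shipments
 * that no longer exist) before adding a foreign-key constraint.
 */
class OrphanCleanup {

    /** Return only the doc-to-shipment rows whose shipment id is still valid. */
    static Map<Integer, Integer> dropOrphans(Map<Integer, Integer> docToShipment,
                                             Set<Integer> validShipmentIds) {
        Map<Integer, Integer> kept = new LinkedHashMap<>();
        for (Map.Entry<Integer, Integer> e : docToShipment.entrySet()) {
            if (validShipmentIds.contains(e.getValue())) {
                kept.put(e.getKey(), e.getValue());   // row survives the purge
            }
        }
        return kept;
    }
}
```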

SOW 5 Re-architect EDI interface

  • 0003523: Re-architect the interface: connect problem between CT2 & FTP Server for TMS, Air Status & EUAN BP.(related: 0003479)

Main Objective: Prevent a single FTP thread from indefinitely blocking other FTP threads due to any error condition (connection problem, transmission problem, etc.). This applies to all FTP interfaces, including TMS, Trendset, etc.

Proposed Solution:

  • Add a timeout to all FTP processes so that control returns to the application after a predetermined interval. The timeout value should not be hard-coded, but driven by config files or the database.
  • Modify the existing FTP processes that delete and then copy a file from/to the same server to issue rename/move commands instead; deleting and re-copying the same file on the same server wastes a lot of bandwidth.
  • The FTP process currently attempts to create a directory to store invalid files for each session. Modify the process to issue this command only during the initial environment build, not for every FTP session.
  • (Nice-to-have; will not be implemented in this SOW) Modify the process to not open/close an FTP connection for each and every file. Keep an FTP session open for a set period to process all requests for that cycle, then close the connection.
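The timeout item above can be sketched with a worker thread and `Future.get(timeout)`, so a stuck transfer hands control back after a configurable interval instead of blocking the other FTP threads. `FtpWithTimeout` and its null-on-timeout behavior are assumptions for illustration, not CT2 code:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

/**
 * Sketch: run each FTP transfer on a worker thread and give up after a
 * timeout read from configuration, so one hung transfer cannot block
 * the rest of the FTP interfaces (TMS, Trendset, ...).
 */
class FtpWithTimeout {
    private static final ExecutorService pool = Executors.newCachedThreadPool();

    /** Returns the task result, or null if it did not finish within timeoutMillis. */
    static <T> T runWithTimeout(Callable<T> task, long timeoutMillis) {
        Future<T> future = pool.submit(task);
        try {
            return future.get(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            future.cancel(true);          // interrupt the stuck transfer
            return null;
        } catch (InterruptedException | ExecutionException e) {
            return null;                  // transfer failed; caller retries later
        }
    }

    static void shutdown() {
        pool.shutdownNow();
    }
}
```

The `timeoutMillis` argument would come from a config file or database row rather than a constant, per the proposal above.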