Data warehouse practitioners generally agree that the largest effort in building a data warehouse today lies in ETL processing. The workload and complexity of ETL processing depend on the varied and heterogeneous profiles of the data sources to be collected. A study has been conducted to reduce the ETL workload in the data warehouse development stages, and a new concept of localized data source cleansing is proposed. Inconsistent, non-formal, unexpectedly existing, and duplicated data in each localized data source profile should be identified locally. This local identification is expected to lighten and shorten ETL processing, so that the workload performance of ETL processing improves. An investigation into the impact of localized versus non-localized cleansing of heterogeneous data has been carried out, and based on this investigation an automatic localized data cleansing and integration system has been defined. In this system, the cleansing process for each data source profile is executed at the transactional data source site, i.e., before the data warehouse development stages, so that the subsequent ETL workload decreases. The results show that the reduction in raw data achieved through local cleansing is significant for data sources that lack integrity constraints and format-checking procedures.
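The following is a minimal sketch, not the system described above, of what per-source cleansing executed at the transactional data source site could look like; the column names, format rules, and duplicate key are hypothetical and only illustrate filtering out non-formal, inconsistent, and duplicated rows before ETL extraction.

```python
# Hypothetical sketch: localized (per-source) cleansing run at the source site,
# before ETL. Column names, format rules, and key fields are illustrative only.
import csv
import re
from datetime import datetime

DATE_FORMAT = "%Y-%m-%d"                              # assumed canonical date format
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # simple well-formedness check

def is_well_formed(row):
    """Reject rows that violate simple format rules (non-formal data)."""
    try:
        datetime.strptime(row["order_date"], DATE_FORMAT)
    except (KeyError, ValueError):
        return False
    return bool(EMAIL_RE.match(row.get("email", "")))

def is_consistent(row):
    """Reject rows that violate basic integrity rules (inconsistent data)."""
    try:
        return float(row["quantity"]) > 0 and float(row["unit_price"]) >= 0
    except (KeyError, ValueError):
        return False

def cleanse_source(in_path, out_path, key_fields=("order_id",)):
    """Keep only well-formed, consistent, non-duplicate rows for later extraction."""
    seen_keys = set()
    kept = dropped = 0
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            key = tuple(row.get(f) for f in key_fields)
            if key in seen_keys or not is_well_formed(row) or not is_consistent(row):
                dropped += 1          # duplicated, non-formal, or inconsistent row
                continue
            seen_keys.add(key)
            writer.writerow(row)
            kept += 1
    return kept, dropped

if __name__ == "__main__":
    kept, dropped = cleanse_source("orders_raw.csv", "orders_clean.csv")
    print(f"kept {kept} rows, dropped {dropped} rows before ETL")
```

Running such a filter locally reduces the volume of raw data handed to the ETL stage, which is the effect the study attributes to localized cleansing, particularly for sources without integrity constraints or format checks of their own.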