Compare commits

...

111 commits

Author SHA1 Message Date
cfc80bbb0d feat: apply Gitea team project workflow structure
- .claude/rules/: team policy, Git workflow, code style, naming, and testing rules
- .claude/skills/: init-project, sync-team-workflow, create-mr, fix-issue
- .claude/settings.json: deny rules + hooks
- .claude/workflow-version.json: v1.2.0 applied
- .githooks/: commit-msg (grep -P→-E for macOS compatibility), pre-commit, post-checkout
- .editorconfig, .sdkmanrc, .mvn/settings.xml (Nexus mirror)
- .gitignore: switch to tracking the .claude/ team files
- CLAUDE.md: moved to the project root

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 22:00:24 +09:00
0743fd4322 chore: remove unneeded script
- remove scripts/collect_signalkind_candidates.sh

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 21:54:29 +09:00
82d427bda2 chore: remove unneeded documents
- delete DEVELOPMENT_GUIDE.md (49KB), replaced by CLAUDE.md
- delete SWAGGER_GUIDE.md (16KB), replaced by auto-generated Swagger docs

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 21:53:06 +09:00
290933f94f chore: rename Kafka topic and add SignalKind collection script
- tp_SNP_AIS_Signal → tp_Global_AIS_Signal (3 profiles)
- add scripts/collect_signalkind_candidates.sh

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 21:52:50 +09:00
LHT
178ac506bf feat: add AIS target Kafka producer pipeline 2026-02-13 03:10:38 +09:00
hyojin kim
07368f18cb 🔥 change application.yml settings 2026-02-12 10:41:27 +09:00
hyojin-kim4
a93942d4d6
🔀 Apply terminology standardization (excluding AIS) (#6)
* 🔧 Remove hard-coded Schema/Table names

* 🔥 Remove BatchSchemaProperties.java and unify schema configuration via @Value

* 🗃️ Terminology standardization

- Facility Port
- Common Code
- Risk&Compliance
- Movement
- Event
- PSC
- Ship specification data
2026-02-12 10:27:22 +09:00
hyojin-kim4
f53648290c
🔀 Add data-value validation columns (#4)
* 🗃️ PSC: add value validation columns

* 🗃️ Facility: add value validation columns

* 🔊 Facility: add API request logging

* 🗃️ Event: add value validation columns

* 🗃️ Movement: add value validation columns

* 🗃️ Common Code: add value validation columns and an API log service

* 🗃️ IMO metadata collection: add value validation columns and an API log service

* 🗃️ Risk&Compliance: add value validation columns

* 🗃️ Ship specification data: add value validation columns, remove hash-comparison process

* 🗃️ schema change: snp_data -> t_snp_data
2026-02-05 18:49:27 +09:00
hyojin kim
6555c5e28f Merge branch 'main' into develop 2026-01-23 15:06:58 +09:00
hyojin kim
3cbc2d2e94 Merge branch 'dev_movements' into develop 2026-01-21 14:36:14 +09:00
hyojin kim
a59c91ae1f Merge branch 'dev_psc' into develop 2026-01-21 14:36:07 +09:00
hyojin kim
30304de4e6 🗃️ ship_detail_data, additionalshipsdata: add datasetversion column collection 2026-01-21 14:31:56 +09:00
hyojin kim
7a1b24e381 🗃️ Dark Activity Confirmed: add area_country column collection 2026-01-21 13:30:26 +09:00
hyojin kim
8d2cd09725 🗃️ apply PSC collection-excluded columns 2026-01-21 13:20:53 +09:00
hyojin kim
6c4ce9a536 🗃️ Terminal Call: add missing collection columns 2026-01-21 11:17:42 +09:00
hyojin kim
9fed34e1bc 🔥 change Risk&Compliance Current/History collection method 2026-01-20 10:09:59 +09:00
hyojin kim
21368ffaff 🐛 fix insert query error 2026-01-19 15:30:13 +09:00
hyojin kim
7ab53d1bbf 🔥 remove Company Compliance collection from ship specification data 2026-01-19 10:49:54 +09:00
hyojin kim
613980c496 🔥 remove Company Compliance collection from ship specification data 2026-01-19 09:43:33 +09:00
hyojin kim
e63607a69d add Company Compliance collection job 2026-01-16 17:12:04 +09:00
hyojin kim
f4421fa455 change request unit for ship specification data 2026-01-16 14:17:06 +09:00
hyojin kim
43057d74fb add Company Detail collection process 2026-01-16 14:15:00 +09:00
hyojin kim
64a3a55e78 add batch_api_log management process 2026-01-15 15:58:20 +09:00
hyojin kim
f2c4e0d14f 🔇 Web Services API Log Control 2026-01-12 15:11:05 +09:00
hyojin kim
5305f61a41 🔇 Ships API Log Control 2026-01-12 14:41:08 +09:00
hyojin kim
c3dabd370c Merge branch 'develop' into dev_shipdetail_sync 2026-01-09 16:07:28 +09:00
hyojin kim
9c021f298c Add Ship Detail Sync Job 2026-01-09 16:07:00 +09:00
hyojin kim
cbb53fd9f1 🗃️ change Core cache targets 2026-01-09 14:59:20 +09:00
49d2de1965 Split AIS Target DB sync into its own job (cache→DB on a 15-minute cycle)
- AisTargetDataWriter: remove DB writes; update the cache only
- new AisTargetDbSyncJob: cache→DB sync every 15 minutes
- AisTargetDbSyncTasklet: read the last 15 minutes of data from the cache, then UPSERT
- application.yml: add ais-target-db-sync settings

Data flow change:
- before: API (every 1 min) → cache + DB (~33K rows written per minute)
- after: API (every 1 min) → cache only; DB stores the latest row per MMSI every 15 minutes

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 14:25:27 +09:00
hyojin kim
1ab78e881f 🔊 API Response Error Log Update 2026-01-09 13:39:18 +09:00
hyojin kim
4e79794750 chunk & batch size change 2026-01-09 10:21:10 +09:00
hyojin kim
abe5ea1a1c Merge branch 'dev_batchflag' into develop 2026-01-08 15:59:01 +09:00
hyojin kim
d8b8a40316 🗃️ remove batch_flag of new_snp schema 2026-01-08 15:57:46 +09:00
hyojin kim
b842ec8d54 🗃️ Crew List Unique Index Change 2026-01-08 15:28:03 +09:00
hyojin kim
e1fa48768e 💥 change and unify API query-period configuration 2026-01-08 15:12:06 +09:00
hyojin kim
87a9217853 🗃️ ais_target ddl update 2026-01-07 13:18:10 +09:00
hyojin kim
6e70e921af 🗃️ add data and columns for the AIS Target change 2026-01-05 17:42:53 +09:00
hyojin kim
3fb133e367 🗃️ add core20 columns: additional AIS columns 2026-01-05 15:04:07 +09:00
hyojin kim
31262f5dda 🔇 change log scope 2025-12-31 13:59:23 +09:00
hyojin kim
99fcd38d24 🗃️ procedure change 2025-12-31 12:38:07 +09:00
hyojin kim
7360736cb0 🏗️ Movement Batch Package Rearrange 2025-12-31 10:53:31 +09:00
hyojin kim
6aba0f55b0 🗃️ Event table name change
- SQL injection prevention
2025-12-31 10:37:20 +09:00
hyojin kim
1d2a3c53c8 Add Compliance History Value Change Manage Step 2025-12-31 09:59:25 +09:00
hyojin kim
020f16035b Merge branch 'develop' of https://github.com/GC-IncheonService-KDN/SNP-Batch into develop 2025-12-29 18:02:31 +09:00
hyojin kim
94f7d4b5c0 🔨 add multi-step job config 2025-12-29 18:02:18 +09:00
Kim JiMyeung
0a5e2e56af apply batch parameters from request 2025-12-29 15:35:18 +09:00
hyojin kim
32af369f23 🗃️ change target schema for Last Position Update 2025-12-24 14:24:17 +09:00
hyojin kim
fcf1d74c38 Risk&Compliance Range Import Update 2025-12-24 14:15:13 +09:00
hyojin kim
5683000024 Merge branch 'dev_event' into develop 2025-12-23 14:39:43 +09:00
Kim JiMyeung
a7cf1647f8 load event attributes into new_snp instead of snp_data 2025-12-23 14:33:53 +09:00
hyojin kim
6d7b7c9eea Merge branch 'dev_event' into develop 2025-12-23 12:36:48 +09:00
hyojin kim
6885d41ba5 Merge branch 'dev_shipdetail' of https://github.com/GC-IncheonService-KDN/SNP-Batch into dev_shipdetail 2025-12-23 12:35:13 +09:00
hyojin kim
7b1fe1d52c 🗃️ Ship Data schema change 2025-12-23 12:33:10 +09:00
hyojin kim
bff4de17c7 🗃️ chunk size change 2025-12-23 11:28:17 +09:00
hyojin kim
bda2d812ff 🗃️ Ship Data schema change 2025-12-23 11:23:29 +09:00
Kim JiMyeung
1124c2e84a change risk and compliance jobs to range form 2025-12-23 09:42:50 +09:00
Kim JiMyeung
75531ab5e5 handle startDate/endDate logic 2025-12-22 13:11:25 +09:00
hyojin kim
4700ec862b 💩 temporary commit 2025-12-19 17:13:40 +09:00
Kim JiMyeung
e7ea47b02c Merge branch 'dev_movement_daterange' into dev_event 2025-12-19 13:59:38 +09:00
Kim JiMyeung
63e9253d7f change Movement methods to range form 2025-12-19 13:37:35 +09:00
hyojin kim
acd76bd358 develop Event Detail load process
- StartDate/EndDate extraction still needed
2025-12-19 10:57:40 +09:00
hyojin kim
270b2a0b55 ⚰️ remove unnecessary comments 2025-12-16 16:02:08 +09:00
hyojin kim
084be88b98 S&P country-code and ship-type-code import job 2025-12-16 15:56:02 +09:00
hyojin kim
fb10e3cc39 🦖 change ship specification table core20 > ship_detail_data 2025-12-16 10:20:46 +09:00
hyojin kim
b2167d4ec7 change Event range setup
- change API_KET setup
2025-12-15 13:31:42 +09:00
hyojin kim
630c366a06 Merge branch 'dev_ship_movement' into develop 2025-12-15 10:16:25 +09:00
Kim JiMyeung
e7f4a9d912 develop AnchorageCalls, Berthcalls, DarkActivity, StsOperations, TerminalCalls jobs 2025-12-15 10:09:18 +09:00
hyojin kim
1c491de9e2 🗃️ fix application.xml 2025-12-12 15:34:02 +09:00
Kim JiMyeung
3118df3533 Merge remote-tracking branch 'origin/develop' into dev_ship_movement 2025-12-12 14:48:49 +09:00
hyojin kim
090f009529 develop ShipDetailUpdateJob
- CrewList
- StowageCommodity
- GroupBeneficialOwnerHistory
- ShipManagerHistory
- OperatorHistory
- TechnicalManagerHistory
- BareBoatCharterHistory
- NameHistory
- FlagHistory
- AdditionalInformation
- PandIHistory
- CallSignAndMmsiHistory
- IceClass
- SafetyManagementCertificateHistory
- ClassHistory
- SurveyDatesHistory
- SurveyDatesHistoryUnique
- SisterShipLinks
- StatusHistory
- SpecialFeature
- Thrusters
2025-12-12 13:12:40 +09:00
Kim JiMyeung
c46a62268c fix reader 2025-12-12 11:20:13 +09:00
Kim JiMyeung
f2970872fd add mvmn_type on conflict handling 2025-12-12 11:14:10 +09:00
Kim JiMyeung
ac78a1340a Merge branch 'dev_ship_movement' of https://github.com/GC-IncheonService-KDN/SNP-Batch into dev_ship_movement 2025-12-11 16:31:18 +09:00
Kim JiMyeung
3ee6ae1bf7 pscJob 2025-12-11 16:29:28 +09:00
hyojin kim
2a0a80098d Merge branch 'develop' into dev_ship_movement 2025-12-10 12:33:57 +09:00
hyojin kim
eb81be5f21 🗃️ clean up application.xml 2025-12-10 10:54:44 +09:00
hyojin kim
655318e353 🗃️ change Risk&Compliance load method (load history data) 2025-12-10 10:13:09 +09:00
hyojin kim
2e509560de Merge branch 'ais/ship_position' into develop 2025-12-10 08:54:42 +09:00
fedd89c9ca [Fix]
- add profile for GPU DB core20 table info
2025-12-10 08:46:15 +09:00
3dde3d0167 [Add]
- add Classtype parameter to the realtime ship position API (classify ClassA/ClassB by presence of imo in the core20 table)
- remove PUT, DELETE, PATCH methods from html and switch to POST (security issue)
2025-12-10 08:14:28 +09:00
Kim JiMyeung
6c98ebc24f Destination, Transits, CurrentlyAt incremental jobs 2025-12-08 17:47:30 +09:00
Kim JiMyeung
18ab11068a add empty-array handling logic 2025-12-08 13:33:57 +09:00
hyojin kim
37f61fe924 Add Port Import Job, Event Import Job 2025-12-08 13:33:37 +09:00
hyojin kim
e9b30f8817 🗃️ set JPA schema (snp_data) 2025-12-08 13:33:23 +09:00
Kim JiMyeung
34ce85f33f Merge remote-tracking branch 'origin/develop' into dev_ship_movement 2025-12-08 13:17:06 +09:00
Kim JiMyeung
919b0fc21a AnchorageCalls, Berthcalls, DarkActivity, StsOperations, TerminalCalls incremental jobs 2025-12-08 13:00:08 +09:00
Kim JiMyeung
7941396d62 ais/ship_position into dev_ship_movement 2025-12-05 11:00:28 +09:00
Kim JiMyeung
248e9c2c46 add /snp-asi url 2025-12-05 10:17:08 +09:00
Kim JiMyeung
2671d613f3 merge devlop into dev_ship_movement 2025-12-05 09:44:20 +09:00
hyojin kim
1b7fa47dbd Merge branch 'ais/ship_position' into develop 2025-12-05 09:33:59 +09:00
8d8ea53449 [Add]
- add temporary SQL for cleaning up jobs that did not terminate normally (e.g. after a process restart)
2025-12-05 08:31:11 +09:00
322ecb12a6 [Fix]
- remove hard-coded urls
- store bootstrap locally and fix references
2025-12-04 15:38:01 +09:00
55d4dd5886 [Fix]
- add partition management job (pre-create partitions 3 days ahead, auto-drop partitions older than 14 days)
- (temporary) change GPU production port to 9000
- change ais_target table to a daily partition structure (~20GB of data per day)
2025-12-04 13:05:00 +09:00
hyojin kim
c842e982c8 Merge branch 'dev_ship_movement' into dev_ship_detail
# Conflicts:
#	src/main/java/com/snp/batch/global/config/MaritimeApiWebClientConfig.java
2025-12-02 19:11:29 +09:00
hyojin kim
44ae82e2fa Merge branch 'ais/ship_position' into dev_ship_detail
# Conflicts:
#	src/main/java/com/snp/batch/jobs/sanction/batch/reader/ComplianceDataReader.java
#	src/main/resources/application.yml
2025-12-02 19:10:15 +09:00
hyojin kim
d6cf58d737 Add Port Import Job, Event Import Job 2025-12-02 18:26:54 +09:00
5857a4a822 [Fix]
- voyage-condition filter search API (SOG/COG/Heading/Destination/Status)
- update Swagger Status filter values
Under way sailing
N/A
AIS Sart
Restriced manoeuverability
Not under command
Engaged in fishing
Under way using engine
Anchored
Constrained by draught
Aground
Power Driven Towing Alongside
Power Driven Towing Astern
Moored
2025-12-02 16:44:14 +09:00
6af2fccbf0 [New features]
- aisTargetImportJob: S&P Global AIS API integration (every minute, at second 15)
- AIS Target query API (MMSI/time/spatial/polygon/WKT search)
- voyage-condition filter search API (SOG/COG/Heading/Destination/Status)
- Caffeine cache (TTL 120 minutes, up to 300K entries)
- partitionManagerJob: auto-create daily and monthly partitions once a day

[Improvements]
- API context-path changed to /snp-api (avoids proxy config conflicts with other API services)
- BaseApiReader: add state-reset logic (fixes the 0-row bug on re-run)
- logback-spring.xml: split log files and apply a rolling policy
2025-12-02 16:24:57 +09:00
Kim JiMyeung
c99b6993a7 add empty-array handling logic 2025-12-02 12:53:17 +09:00
hyojin kim
b3cb4f6f19 🗃️ set JPA schema (snp_data) 2025-12-02 12:26:49 +09:00
hyojin kim
4282fc9106 🗃️ add Risk&Compliance batch_flag 2025-11-28 18:21:21 +09:00
hyojin kim
8a3e9a973e 🗃️ apply Risk&Compliance index change 2025-11-28 10:46:44 +09:00
hyojin kim
68893f9657 🛂 change production server request URL 2025-11-28 10:43:10 +09:00
hyojin kim
5787fb5be0 Merge branch 'dev_ship_movement' into dev_ship_detail 2025-11-27 22:20:34 +09:00
hyojin kim
4ed1070a37 Merge branch 'dev_ship_movement' into dev_ship_detail
# Conflicts:
#	src/main/java/com/snp/batch/global/config/MaritimeApiWebClientConfig.java
2025-11-27 22:20:21 +09:00
hyojin kim
f9b20bdc59 🗃️ fix production access address 2025-11-27 22:03:09 +09:00
hyojin kim
7a405bb969 add swagger production address 2025-11-27 22:00:26 +09:00
hyojin kim
906611c9b8 develop Risk&Compliance Data Import Job 2025-11-27 21:55:46 +09:00
Kim JiMyeung
e44637e1f3 movement batch 2025-11-27 16:20:05 +09:00
hyojin kim
6be90723b4 add COG and NavStat AIS columns to the Core20 table 2025-11-25 18:39:30 +09:00
hyojin kim
18fa95e903 🩹 remove hard-coded OwnerHistory DataSetVersion 2025-11-24 10:43:43 +09:00
177 changed files with 4,593 additions and 7,250 deletions


@@ -0,0 +1,59 @@
# Java Code Style Rules
## General
- Use Java 17+ syntax (records, sealed classes, pattern matching, text blocks)
- Indentation: 4 spaces (no tabs)
- Line length: 120 characters or less
- End every file with a trailing newline
## Class Structure
Member order within a class:
1. static constants (public → private)
2. instance fields (public → private)
3. constructors
4. public methods
5. protected/package-private methods
6. private methods
7. inner classes/enums
## Spring Boot Rules
### Layering
- One-way dependencies: Controller → Service → Repository
- No business logic in controllers (request/response conversion only)
- No circular references between services
- No business logic in repositories
### DTO vs. Entity
- Never expose entities directly in API requests/responses
- Write DTOs as records or immutable classes
- Convert DTO ↔ Entity via mapper classes or factory methods
### Dependency Injection
- Use constructor injection (no `@Autowired` field injection)
- Omit the `@Autowired` annotation when there is a single constructor
- Lombok `@RequiredArgsConstructor` is allowed
### Transactions
- Keep `@Transactional` scope minimal
- Read-only paths: `@Transactional(readOnly = true)`
- Apply at the Service method level (avoid class level)
## Lombok Rules
- `@Getter`, `@Setter` allowed (avoid setters on entities)
- `@Builder` allowed
- `@Data` forbidden (use only the annotations you actually need)
- `@AllArgsConstructor` forbidden on its own (use together with `@Builder`)
- Use the `@Slf4j` logger
## Exception Handling
- Define custom Exception classes for business exceptions
- Handle exceptions globally with `@ControllerAdvice`
- Include context information in exception messages
- Never swallow exceptions in catch blocks (no `// ignore`)
## Miscellaneous
- Use `Optional` only as a return type (never for fields or parameters)
- Prefer empty collections or `Optional` over returning `null`
- Use the Stream API (but extract a method once chains exceed 3 steps)
- No hard-coded strings/numbers → extract to constants or configuration
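A few of these rules are mechanically checkable before a real linter is wired up. A rough shell sketch — the heuristics and the source path are assumptions, not part of the team standard, and real enforcement would use Checkstyle or Spotless:

```shell
#!/bin/bash
# Hypothetical style sweep: flags @Autowired usage (the rules forbid field
# injection) and tab indentation (the rules require 4 spaces). Heuristic only.
style_sweep() {
  local dir="$1" count
  # Count lines containing @Autowired or starting with a literal tab.
  count=$(grep -rn --include='*.java' -e '@Autowired' -e "$(printf '^\t')" "$dir" 2>/dev/null | wc -l)
  echo "violations: $((count))"
}

# Demo against a throwaway source tree
demo=$(mktemp -d)
mkdir -p "$demo/com/example"
printf '@Autowired\nprivate FooService fooService;\n' > "$demo/com/example/Bad.java"
style_sweep "$demo"
```

Note that `@Autowired` on a lone constructor is legal per the rules above, so a hit from this sweep is a prompt to look, not an automatic failure.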


@@ -0,0 +1,84 @@
# Git Workflow Rules
## Branching Strategy
### Branch Structure
```
main ← stable, deployable branch (protected)
└── develop ← development integration branch
    ├── feature/ISSUE-123-short-description
    ├── bugfix/ISSUE-456-short-description
    └── hotfix/ISSUE-789-short-description
```
### Branch Naming
- feature branches: `feature/ISSUE-<number>-<short-description>` (e.g. `feature/ISSUE-42-user-login`)
- bugfix branches: `bugfix/ISSUE-<number>-<short-description>`
- hotfix branches: `hotfix/ISSUE-<number>-<short-description>`
- Without an issue number: `feature/<short-description>` (e.g. `feature/add-swagger-docs`)
### Branch Rules
- No direct commits or pushes to main and develop
- feature branches branch off develop
- hotfix branches branch off main
- All merges must go through an MR (Merge Request)
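The branch-naming convention above can be checked with a single pattern; a sketch, where the exact regex is an assumption derived from the examples rather than an official hook:

```shell
# Hypothetical branch-name check matching the convention above.
# Accepts feature|bugfix|hotfix, an optional ISSUE-<number> segment, and a slug.
valid_branch() {
  echo "$1" | grep -qE '^(feature|bugfix|hotfix)/(ISSUE-[0-9]+-)?[a-z0-9._-]+$'
}

valid_branch "feature/ISSUE-42-user-login" && echo "ok"
valid_branch "my-random-branch" || echo "rejected"
```

A check like this would fit naturally in a pre-push hook under `.githooks/`.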
## Commit Message Rules
### Conventional Commits Format
```
type(scope): subject

body (optional)

footer (optional)
```
### type (required)
| type | description |
|------|-------------|
| feat | new feature |
| fix | bug fix |
| docs | documentation change |
| style | code formatting (no functional change) |
| refactor | refactoring (no functional change) |
| test | add/update tests |
| chore | build or configuration change |
| ci | CI/CD configuration change |
| perf | performance improvement |
### scope (optional)
- A short word naming the area of change
- Korean and English both allowed (e.g. `feat(인증): 로그인 기능`, `fix(auth): token refresh`)
### subject (required)
- A concise description of the change
- Korean and English both allowed
- 72 characters or less
- No trailing period (.)
### Examples
```
feat(auth): implement JWT-based login
fix(batch): fix nightly batch timeout
docs: add build instructions to README
refactor(user-service): extract duplicated logic
test(payment): add unit tests for refund logic
chore: update Gradle dependency versions
```
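The first line of a message can be validated against this format with one `grep -E` pattern; a sketch consistent with the rules above (the simplified scope character class is an assumption — the repository's actual commit-msg hook may differ):

```shell
# Hypothetical first-line check for the Conventional Commits format above.
# type is required; scope is optional; subject is 1-72 chars after ": ".
valid_subject() {
  echo "$1" | grep -qE '^(feat|fix|docs|style|refactor|test|chore|ci|perf)(\([^)]+\))?: .{1,72}$'
}

valid_subject "feat(auth): implement JWT-based login" && echo "ok"
valid_subject "update stuff" || echo "rejected"
```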
## MR (Merge Request) Rules
### Creating an MR
- Title: the same Conventional Commits format as commit messages
- Body: summary of changes, how to test, related issue numbers
- Labels: attach appropriate labels (feature, bugfix, hotfix, etc.)
### MR Review
- At least one reviewer approval required
- CI checks must pass (where configured)
- Resolve all review comments before merging
### Merging an MR
- Squash merge recommended (keeps history clean)
- Delete the source branch after merging

.claude/rules/naming.md Normal file

@@ -0,0 +1,60 @@
# Java Naming Rules
## Packages
- All lowercase, singular
- Reverse-domain order: `com.gcsc.<project>.<module>`
- e.g. `com.gcsc.batch.scheduler`, `com.gcsc.api.auth`
## Classes
- PascalCase
- Nouns or noun phrases
- Role indicated by suffix:
| Layer | Suffix | Example |
|------|--------|------|
| Controller | `Controller` | `UserController` |
| Service | `Service` | `UserService` |
| Service impl | `ServiceImpl` | `UserServiceImpl` (only when an interface exists) |
| Repository | `Repository` | `UserRepository` |
| Entity | (none) | `User`, `ShipRoute` |
| Request DTO | `Request` | `CreateUserRequest` |
| Response DTO | `Response` | `UserResponse` |
| Configuration | `Config` | `SecurityConfig` |
| Exception | `Exception` | `UserNotFoundException` |
| Enum | (none) | `UserStatus`, `ShipType` |
| Mapper | `Mapper` | `UserMapper` |
## Methods
- camelCase
- Start with a verb
- CRUD patterns:
| Operation | Controller | Service | Repository |
|------|-----------|---------|------------|
| Read (single) | `getUser()` | `getUser()` | `findById()` |
| Read (list) | `getUsers()` | `getUsers()` | `findAll()` |
| Create | `createUser()` | `createUser()` | `save()` |
| Update | `updateUser()` | `updateUser()` | `save()` |
| Delete | `deleteUser()` | `deleteUser()` | `deleteById()` |
| Exists | - | `existsUser()` | `existsById()` |
## Variables
- camelCase
- Meaningful names (no single-letter variables, except loop indices `i, j, k`)
- booleans: `is`, `has`, `can`, `should` prefixes
- e.g. `isActive`, `hasPermission`, `canDelete`
## Constants
- UPPER_SNAKE_CASE
- e.g. `MAX_RETRY_COUNT`, `DEFAULT_PAGE_SIZE`
## Tests
- Classes: `{TargetClass}Test` (e.g. `UserServiceTest`)
- Methods: `{method}_{scenario}_{expectedResult}`, or a Korean `@DisplayName`
- e.g. `createUser_withDuplicateEmail_throwsException()`
- e.g. `@DisplayName("중복 이메일로 생성 시 예외 발생")`
## Files/Directories
- Java files: PascalCase (same as the class name)
- Resource files: kebab-case (e.g. `application-local.yml`)
- SQL files: `V{number}__{description}.sql` (Flyway) or kebab-case
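The file-naming rules are also pattern-checkable; a sketch where the regexes are assumptions derived from the examples above, not a published standard:

```shell
# Hypothetical filename checks for the resource and Flyway SQL naming rules.
is_kebab_resource() {   # e.g. application-local.yml
  echo "$1" | grep -qE '^[a-z0-9]+(-[a-z0-9]+)*\.(yml|yaml|properties)$'
}
is_flyway_sql() {       # e.g. V3__add_user_table.sql
  echo "$1" | grep -qE '^V[0-9]+__[A-Za-z0-9_-]+\.sql$'
}

is_kebab_resource "application-local.yml" && echo "resource ok"
is_flyway_sql "V3__add_user_table.sql" && echo "migration ok"
```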


@@ -0,0 +1,34 @@
# Team Policy
These rules are mandatory policy applied across the whole organization.
Projects may define additional rules under their own `.claude/rules/`, but may not violate this policy.
## Security Policy
### Prohibited Actions
- Never read or print the contents of `.env`, `.env.*`, or `secrets/` files
- Never hard-code passwords, API keys, tokens, or other secrets in code
- Never run `git push --force`, `git reset --hard`, or `git clean -fd`
- Never run destructive commands such as `rm -rf /`, `rm -rf ~`, `rm -rf .git`
- Never push directly to main/develop (merge only through MRs)
### Credential Management
- Manage credentials via environment variables or external config files (`.env`, `application-local.yml`)
- Those config files must be listed in `.gitignore`
- Commit only example files (`.env.example`, `application.yml.example`)
## Code Quality Policy
### Required Checks
- Verify the build (compile) succeeds before committing
- Keep lint warnings at zero (also verified in CI)
- Projects with test code must keep the tests passing
### Code Review
- At least one review required to merge into main
- No merging without reviewer approval
## Documentation Policy
- Public APIs (controller endpoints) must have descriptive comments
- Complex business logic must have comments explaining intent
- Keep build/run instructions in README.md up to date

.claude/rules/testing.md Normal file

@@ -0,0 +1,62 @@
# Java Testing Rules
## Test Frameworks
- JUnit 5 + AssertJ
- Mockito for mocking dependencies
- Spring Boot Test (`@SpringBootTest`) only for integration tests
## Test Structure
### Unit Tests
- Cover Service, Util, and domain logic
- No Spring context loading (`@ExtendWith(MockitoExtension.class)`)
- Mock external dependencies with Mockito
```java
@ExtendWith(MockitoExtension.class)
class UserServiceTest {

    @InjectMocks
    private UserService userService;

    @Mock
    private UserRepository userRepository;

    @Test
    @DisplayName("사용자 생성 시 정상 저장")
    void createUser_withValidInput_savesUser() {
        // given
        // when
        // then
    }
}
```
### Integration Tests
- Controller tests: `@WebMvcTest` + `MockMvc`
- Repository tests: `@DataJpaTest`
- Full-flow tests: `@SpringBootTest` (keep to a minimum)
### Test Pattern
- Use the **Given-When-Then** structure
- Separate the sections with comments
- One assertion concern per test (where practical)
## Test Naming
- Method names: the `{method}_{scenario}_{expectedResult}` pattern
- `@DisplayName`: describe the test's intent in Korean
## Test Coverage
- New Service classes: tests for core business logic are required
- When modifying existing code: tests for the modified logic are recommended
- Controllers: integration tests for key API endpoints are recommended
## Test Data
- Create test data inside the test method or in `@BeforeEach`
- Extract shared test data into TestFixture classes
- When a real DB is needed, use H2 in-memory or Testcontainers
## Prohibited
- No `@SpringBootTest` in unit tests
- No shared state between tests
- No `Thread.sleep()` → use `Awaitility`
- No real external API calls → use WireMock or Mockito

.claude/settings.json Normal file

@@ -0,0 +1,78 @@
{
  "$schema": "https://json.schemastore.org/claude-code-settings.json",
  "permissions": {
    "allow": [
      "Bash(./mvnw *)",
      "Bash(mvn *)",
      "Bash(java -version)",
      "Bash(git status)",
      "Bash(git diff*)",
      "Bash(git log*)",
      "Bash(git branch*)",
      "Bash(git checkout*)",
      "Bash(git add*)",
      "Bash(git commit*)",
      "Bash(git pull*)",
      "Bash(git fetch*)",
      "Bash(git merge*)",
      "Bash(git stash*)",
      "Bash(git remote*)",
      "Bash(git config*)",
      "Bash(git rev-parse*)",
      "Bash(git show*)",
      "Bash(git tag*)",
      "Bash(curl -s *)",
      "Bash(sdk *)"
    ],
    "deny": [
      "Bash(git push --force*)",
      "Bash(git reset --hard*)",
      "Bash(git clean -fd*)",
      "Bash(git checkout -- .)",
      "Bash(rm -rf /)",
      "Bash(rm -rf ~)",
      "Bash(rm -rf .git*)",
      "Bash(rm -rf /*)",
      "Read(./**/.env*)",
      "Read(./**/secrets/**)",
      "Read(./**/application-local.yml)"
    ]
  },
  "hooks": {
    "SessionStart": [
      {
        "matcher": "compact",
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-post-compact.sh",
            "timeout": 10
          }
        ]
      }
    ],
    "PreCompact": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-pre-compact.sh",
            "timeout": 30
          }
        ]
      }
    ],
    "PostToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-commit.sh",
            "timeout": 15
          }
        ]
      }
    ]
  }
}


@@ -0,0 +1,65 @@
---
name: create-mr
description: Creates a Gitea MR (Merge Request) from the current branch
allowed-tools: "Bash, Read, Grep"
argument-hint: "[target-branch: develop|main] (default: develop)"
---
Creates a Gitea MR based on the changes in the current branch.
Target branch: $ARGUMENTS (default: develop)
## Steps
### 1. Preconditions
- Verify the current branch is not main/develop
- Check for uncommitted changes (warn if any)
- Verify the current branch is pushed to the remote (push it if not)
### 2. Analyze the Changes
```bash
git log develop..HEAD --oneline
git diff develop..HEAD --stat
```
- Collect the commit list and the changed files
- Write a summary of the main changes
### 3. Build the MR Info
- **Title**: derived from the branch's first commit message or the branch name
  - `feature/ISSUE-42-user-login` → `feat: ISSUE-42 user-login`
- **Body**:
```markdown
## Changes
- (auto-generated from commits)
## Related Issues
- closes #<issue-number> (extracted from the branch name)
## Testing
- [ ] build succeeds
- [ ] existing tests pass
```
### 4. Create the MR via the Gitea API
```bash
# Extract owner/repo from the Gitea remote URL
REMOTE_URL=$(git remote get-url origin)

# Call the Gitea API
curl -X POST "GITEA_URL/api/v1/repos/{owner}/{repo}/pulls" \
  -H "Authorization: token ${GITEA_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "MR title",
    "body": "MR body",
    "head": "current-branch",
    "base": "target-branch"
  }'
```
### 5. Output
- Print the MR URL
- Suggest assigning reviewers
- Next steps: wait for review → approval → merge
## Required Environment Variables
- `GITEA_TOKEN`: Gitea API access token (prompt for it if missing)
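Step 4 leaves the owner/repo extraction implicit. One way to sketch it for both HTTPS and SSH remotes — the URL forms are assumptions about typical Gitea remotes, not part of the skill:

```shell
# Hypothetical owner/repo extraction from a git remote URL.
# Handles https://host/owner/repo(.git) and git@host:owner/repo(.git).
extract_owner_repo() {
  echo "$1" | sed -E 's#^(https?://[^/]+/|git@[^:]+:)##; s#\.git$##'
}

extract_owner_repo "https://gitea.gc-si.dev/gc/SNP-Batch.git"
extract_owner_repo "git@gitea.gc-si.dev:gc/SNP-Batch"
```

Both calls print `gc/SNP-Batch`, which slots directly into the `{owner}/{repo}` part of the API path.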


@@ -0,0 +1,49 @@
---
name: fix-issue
description: Analyzes a Gitea issue and creates a fix branch
allowed-tools: "Bash, Read, Write, Edit, Glob, Grep"
argument-hint: "<issue-number>"
---
Analyzes Gitea issue #$ARGUMENTS and starts the fix.
## Steps
### 1. Fetch the Issue
```bash
curl -s "GITEA_URL/api/v1/repos/{owner}/{repo}/issues/$ARGUMENTS" \
  -H "Authorization: token ${GITEA_TOKEN}"
```
- Check the issue title, body, labels, and assignee
- Summarize the issue for the user
### 2. Create a Branch
Choose the branch type from the issue labels:
- `bug` label → `bugfix/ISSUE-<number>-<description>`
- otherwise → `feature/ISSUE-<number>-<description>`
- urgent → `hotfix/ISSUE-<number>-<description>`
```bash
git checkout develop
git pull origin develop
git checkout -b {type}/ISSUE-{number}-{slug}
```
### 3. Analyze the Issue
Based on the issue content:
- Find related files (using Grep, Glob)
- Determine the impact scope
- Propose a fix direction
### 4. Present a Fix Plan
Show the user the plan and proceed only after approval:
- Files to change
- Summary of changes
- Expected impact
### 5. After Completion
- Summarize the changes
- Suggest running `/create-mr`
## Required Environment Variables
- `GITEA_TOKEN`: Gitea API access token
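The `{slug}` in step 2 has to come from the issue title. A hedged sketch of one way to derive it — the truncation length and ASCII-only normalization are assumptions:

```shell
# Hypothetical slug derivation for branch names: lowercase, hyphen-separated,
# non-alphanumerics collapsed, truncated to 40 chars.
slugify() {
  echo "$1" \
    | tr '[:upper:]' '[:lower:]' \
    | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//' \
    | cut -c1-40
}

slugify "Fix NPE in AIS Target reader"   # fix-npe-in-ais-target-reader
```

A Korean-only title would produce an empty slug under this scheme; falling back to just the issue number would be one reasonable choice in that case.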


@@ -0,0 +1,246 @@
---
name: init-project
description: Initializes a project with the team's standard workflow
allowed-tools: "Bash, Read, Write, Edit, Glob, Grep"
argument-hint: "[project-type: java-maven|java-gradle|react-ts|auto]"
---
Initializes the project according to the team's standard workflow.
Project type: $ARGUMENTS (default: auto — auto-detect)
## Project Type Auto-Detection
If $ARGUMENTS is "auto" or empty, detect in this order:
1. `pom.xml` exists → **java-maven**
2. `build.gradle` or `build.gradle.kts` exists → **java-gradle**
3. `package.json` + `tsconfig.json` exist → **react-ts**
4. Detection fails → ask the user to pick a type
## Steps
### 1. Analyze the Project
- Identify build files, config files, and the directory structure
- Detect frameworks and libraries in use
- Check whether a `.claude/` directory already exists
- Check whether lint tools (eslint, prettier, checkstyle, spotless, etc.) are installed
### 2. Generate CLAUDE.md
Create CLAUDE.md in the project root containing:
- Project overview (name, type, main tech stack)
- Build/run commands (based on the detected build tool)
- Test commands
- Lint commands (based on the detected tools)
- A summary of the project directory structure
- A pointer to the team conventions (`.claude/rules/`)
### Gitea File Download URL Pattern
⚠️ Gitea raw files must be fetched via the **web raw URL** (the `/api/v1/` path does not work):
```bash
GITEA_URL="${GITEA_URL:-https://gitea.gc-si.dev}"

# common files:  ${GITEA_URL}/gc/template-common/raw/branch/develop/<path>
# per-type files: ${GITEA_URL}/gc/template-<type>/raw/branch/develop/<path>

# examples:
curl -sf "${GITEA_URL}/gc/template-common/raw/branch/develop/.claude/rules/team-policy.md"
curl -sf "${GITEA_URL}/gc/template-react-ts/raw/branch/develop/.editorconfig"
```
### 3. Set Up the .claude/ Directory
Skip files that already match the team standard. Otherwise, download from Gitea using the URL pattern above:
- `.claude/settings.json` — standard permissions per project type + hooks section (see step 4)
- `.claude/rules/` — team rule files (team-policy, git-workflow, code-style, naming, testing)
- `.claude/skills/` — team skills (create-mr, fix-issue, sync-team-workflow, init-project)
### 4. Generate Hook Scripts
Create the `.claude/scripts/` directory and the following scripts (chmod +x):
- `.claude/scripts/on-pre-compact.sh`:
```bash
#!/bin/bash
# PreCompact hook: only systemMessage is supported (hookSpecificOutput is not available)
INPUT=$(cat)
cat <<RESP
{
  "systemMessage": "Context compaction is starting. Be sure to:\n\n1. memory/MEMORY.md - update the core work state (under 200 lines)\n2. memory/project-snapshot.md - update changed package/type info\n3. memory/project-history.md - append this session's changes\n4. memory/api-types.md - update if API interfaces changed\n5. Record any unfinished work in TodoWrite and in memory"
}
RESP
```
- `.claude/scripts/on-post-compact.sh`:
```bash
#!/bin/bash
INPUT=$(cat)
CWD=$(echo "$INPUT" | python3 -c "import sys,json;print(json.load(sys.stdin).get('cwd',''))" 2>/dev/null || echo "")
if [ -z "$CWD" ]; then
  CWD=$(pwd)
fi
PROJECT_HASH=$(echo "$CWD" | sed 's|/|-|g')
MEMORY_DIR="$HOME/.claude/projects/$PROJECT_HASH/memory"
CONTEXT=""
if [ -f "$MEMORY_DIR/MEMORY.md" ]; then
  SUMMARY=$(head -100 "$MEMORY_DIR/MEMORY.md" | python3 -c "import sys;print(sys.stdin.read().replace('\\\\','\\\\\\\\').replace('\"','\\\\\"').replace('\n','\\\\n'))" 2>/dev/null)
  CONTEXT="Context was compacted.\\n\\n[Session summary]\\n${SUMMARY}"
fi
if [ -f "$MEMORY_DIR/project-snapshot.md" ]; then
  SNAP=$(head -50 "$MEMORY_DIR/project-snapshot.md" | python3 -c "import sys;print(sys.stdin.read().replace('\\\\','\\\\\\\\').replace('\"','\\\\\"').replace('\n','\\\\n'))" 2>/dev/null)
  CONTEXT="${CONTEXT}\\n\\n[Latest project state]\\n${SNAP}"
fi
if [ -n "$CONTEXT" ]; then
  CONTEXT="${CONTEXT}\\n\\nUse the above to continue the work. See the files under memory/ for details."
  echo "{\"hookSpecificOutput\":{\"additionalContext\":\"${CONTEXT}\"}}"
else
  echo "{\"hookSpecificOutput\":{\"additionalContext\":\"Context was compacted. No memory files exist, so ask the user about the previous work.\"}}"
fi
```
- `.claude/scripts/on-commit.sh`:
```bash
#!/bin/bash
INPUT=$(cat)
COMMAND=$(echo "$INPUT" | python3 -c "import sys,json;print(json.load(sys.stdin).get('tool_input',{}).get('command',''))" 2>/dev/null || echo "")
if echo "$COMMAND" | grep -qE 'git commit'; then
  cat <<RESP
{
  "hookSpecificOutput": {
    "additionalContext": "A commit was detected. Do the following:\n1. Append the change to docs/CHANGELOG.md\n2. Update the changed parts of memory/project-snapshot.md\n3. Append this change to memory/project-history.md\n4. Update memory/api-types.md if API interfaces changed\n5. If the project has lint configured, check the lint results and fix any issues"
  }
}
RESP
else
  echo '{}'
fi
```
If `.claude/settings.json` has no hooks section, add one (merged into the existing settings.json):
```json
{
  "hooks": {
    "SessionStart": [
      {
        "matcher": "compact",
        "hooks": [
          { "type": "command", "command": "bash .claude/scripts/on-post-compact.sh", "timeout": 10 }
        ]
      }
    ],
    "PreCompact": [
      {
        "hooks": [
          { "type": "command", "command": "bash .claude/scripts/on-pre-compact.sh", "timeout": 30 }
        ]
      }
    ],
    "PostToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          { "type": "command", "command": "bash .claude/scripts/on-commit.sh", "timeout": 15 }
        ]
      }
    ]
  }
}
```
### 5. Configure Git Hooks
```bash
git config core.hooksPath .githooks
```
Make the `.githooks/` scripts executable:
```bash
chmod +x .githooks/*
```
### 6. Type-Specific Setup
#### java-maven
- Create `.sdkmanrc` (java=17.0.18-amzn or the version the project needs)
- Verify the `.mvn/settings.xml` Nexus mirror settings
- Verify `mvn compile` succeeds
#### java-gradle
- Create `.sdkmanrc`
- Verify the `gradle.properties.example` Nexus settings
- Verify `./gradlew compileJava` succeeds
#### react-ts
- Create `.node-version` (the Node version the project needs)
- Verify the `.npmrc` Nexus registry settings
- Verify `npm install && npm run build` succeeds
### 7. Check .gitignore
Verify the following entries exist in .gitignore, adding any that are missing:
```
.claude/settings.local.json
.claude/CLAUDE.local.md
.env
.env.*
*.local
```
### 8. Configure Git exclude
Read `.git/info/exclude` and append the following at the bottom, preserving the existing content:
```gitignore
# Claude Code workflow (local only)
docs/CHANGELOG.md
*.tmp
```
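Appending while preserving existing content is easiest to get right if it is also idempotent. A sketch — the marker comment and the temp-file stand-in for `.git/info/exclude` are assumptions for illustration:

```shell
# Hypothetical idempotent append: only add the block if the marker line
# is not already present in the exclude file.
EXCLUDE_FILE="$(mktemp)"   # stand-in for .git/info/exclude in this demo
MARKER="# Claude Code workflow (local only)"

append_excludes() {
  if ! grep -qF "$MARKER" "$EXCLUDE_FILE" 2>/dev/null; then
    {
      echo "$MARKER"
      echo "docs/CHANGELOG.md"
      echo "*.tmp"
    } >> "$EXCLUDE_FILE"
  fi
}

append_excludes
append_excludes   # second call is a no-op, so re-running init is safe
```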
### 9. Initialize Memory
Locate the project memory directory (usually `~/.claude/projects/<project-hash>/memory/`) and create:
- `memory/MEMORY.md` — core summary based on the project analysis (under 200 lines)
  - current state, project overview, tech stack, main package structure, links to details
- `memory/project-snapshot.md` — directory structure, packages, key dependencies, API endpoints
- `memory/project-history.md` — starting with an "initial team workflow setup" entry
- `memory/api-types.md` — summary of the key interface/DTO/Entity types
- `memory/decisions.md` — empty template (# Decision Log)
- `memory/debugging.md` — empty template (# Debugging Experience & Patterns)
### 10. Check Lint Tools
- TypeScript: check for eslint and prettier; offer to install them if missing
- Java: check the checkstyle, spotless, etc. configuration
- Verify CLAUDE.md already records the lint commands
### 11. Generate workflow-version.json
Query the latest team workflow version via the Gitea API:
```bash
curl -sf --max-time 5 "https://gitea.gc-si.dev/gc/template-common/raw/branch/develop/workflow-version.json"
```
Use the returned `version` value on success; fall back to "1.0.0" on failure.
Create `.claude/workflow-version.json`:
```json
{
  "applied_global_version": "<fetched version>",
  "applied_date": "<today>",
  "project_type": "<detected type>",
  "gitea_url": "https://gitea.gc-si.dev"
}
```
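The fetch-with-fallback in step 11 can be sketched as one function; the python3-based JSON parsing mirrors the other scripts in this changeset, and the exact error handling is an assumption:

```shell
# Hypothetical version lookup: fetch workflow-version.json, print its
# "version" field, and fall back to "1.0.0" if the fetch or parse fails.
fetch_workflow_version() {
  local json
  json=$(curl -sf --max-time 5 \
    "https://gitea.gc-si.dev/gc/template-common/raw/branch/develop/workflow-version.json") \
    || { echo "1.0.0"; return; }
  echo "$json" | python3 -c 'import sys, json; print(json.load(sys.stdin).get("version", "1.0.0"))' \
    || echo "1.0.0"
}

fetch_workflow_version
```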
### 12. Verify and Summarize
- List the created/modified files
- Verify `git config core.hooksPath`
- Verify the build command runs
- Verify the hook scripts are executable
- Point out the next steps:
  - how to start development and make the first commit
  - general-purpose skills: `/api-registry`, `/changelog`, `/swagger-spec`


@@ -0,0 +1,98 @@
---
name: sync-team-workflow
description: Syncs the team's global workflow into the current project
allowed-tools: "Bash, Read, Write, Edit, Glob, Grep"
---
Applies the latest version of the team's global workflow to the current project.
## Steps
### 1. Fetch the Global Version
Query workflow-version.json from the template-common repo via the Gitea API:
```bash
GITEA_URL=$(python3 -c "import json; print(json.load(open('.claude/workflow-version.json')).get('gitea_url', 'https://gitea.gc-si.dev'))" 2>/dev/null || echo "https://gitea.gc-si.dev")
curl -sf "${GITEA_URL}/gc/template-common/raw/branch/develop/workflow-version.json"
```
### 2. Compare Versions
Compare against the `applied_global_version` field of the local `.claude/workflow-version.json`:
- Versions match → report "already up to date" and stop
- Versions differ → extract and list the unapplied changes
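The comparison in step 2 is a plain string-equality check, but if "is the local version older?" semantics are ever needed, GNU `sort -V` gives a portable-enough version ordering; a sketch for illustration only (not part of the skill, and `sort -V` is a GNU coreutils feature):

```shell
# Hypothetical version comparison using sort -V ordering.
version_lt() {   # true if $1 < $2
  [ "$1" != "$2" ] && [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -1)" = "$1" ]
}

version_lt "1.2.0" "1.10.0" && echo "update available"
```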
### 3. Detect the Project Type
Auto-detection order:
1. Check the `project_type` field of `.claude/workflow-version.json`
2. Otherwise: `pom.xml` → java-maven, `build.gradle` → java-gradle, `package.json` → react-ts
### Gitea File Download URL Pattern
⚠️ Gitea raw files must be fetched via the **web raw URL** (the `/api/v1/` path does not work):
```bash
GITEA_URL="${GITEA_URL:-https://gitea.gc-si.dev}"

# common files:  ${GITEA_URL}/gc/template-common/raw/branch/develop/<path>
# per-type files: ${GITEA_URL}/gc/template-<type>/raw/branch/develop/<path>

# examples:
curl -sf "${GITEA_URL}/gc/template-common/raw/branch/develop/.claude/rules/team-policy.md"
curl -sf "${GITEA_URL}/gc/template-react-ts/raw/branch/develop/.editorconfig"
```
### 4. Download and Apply Files
Download the common + type-specific template files using the URL pattern above:
#### 4-1. Rule Files (overwrite)
Team rules cannot be modified locally — always replace them with the latest global versions:
```
.claude/rules/team-policy.md
.claude/rules/git-workflow.md
.claude/rules/code-style.md (per type)
.claude/rules/naming.md (per type)
.claude/rules/testing.md (per type)
```
#### 4-2. settings.json (partial update)
- `deny` list: replace with the latest global version
- `allow` list: keep existing user customizations, merge in the global defaults
- `hooks`: replace using the hooks JSON block in the init-project SKILL.md (add if missing)
  - SessionStart(compact) → on-post-compact.sh
  - PreCompact → on-pre-compact.sh
  - PostToolUse(Bash) → on-commit.sh
#### 4-3. Skill Files (overwrite)
```
.claude/skills/create-mr/SKILL.md
.claude/skills/fix-issue/SKILL.md
.claude/skills/sync-team-workflow/SKILL.md
.claude/skills/init-project/SKILL.md
```
#### 4-4. Git Hooks (overwrite + executable)
```bash
chmod +x .githooks/*
```
#### 4-5. Refresh Hook Scripts
Extract the latest scripts from the code blocks in the init-project SKILL.md and overwrite:
```
.claude/scripts/on-pre-compact.sh
.claude/scripts/on-post-compact.sh
.claude/scripts/on-commit.sh
```
Make them executable: `chmod +x .claude/scripts/*.sh`
### 5. Update the Local Version
Update `.claude/workflow-version.json`:
```json
{
  "applied_global_version": "<new version>",
  "applied_date": "<today>",
  "project_type": "<detected type>",
  "gitea_url": "https://gitea.gc-si.dev"
}
```
### 6. Report the Changes
- Review the changes with `git diff`
- List the updated files
- Show the changelog (the `changes` field of the global workflow-version.json)
- Note any follow-up actions (build verification, dependency updates, etc.)


@@ -0,0 +1,6 @@
{
"applied_global_version": "1.2.0",
"applied_date": "2026-02-14",
"project_type": "java-maven",
"gitea_url": "https://gitea.gc-si.dev"
}

.editorconfig Normal file

@@ -0,0 +1,33 @@
root = true
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.{java,kt}]
indent_style = space
indent_size = 4
[*.{js,jsx,ts,tsx,json,yml,yaml,css,scss,html}]
indent_style = space
indent_size = 2
[*.md]
trim_trailing_whitespace = false
[*.{sh,bash}]
indent_style = space
indent_size = 4
[Makefile]
indent_style = tab
[*.{gradle,groovy}]
indent_style = space
indent_size = 4
[*.xml]
indent_style = space
indent_size = 4

.githooks/commit-msg Executable file

@@ -0,0 +1,60 @@
#!/bin/bash
#==============================================================================
# commit-msg hook
# Validates Conventional Commits format (mixed Korean/English supported)
#==============================================================================
COMMIT_MSG_FILE="$1"
COMMIT_MSG=$(cat "$COMMIT_MSG_FILE")
# Skip validation for merge commits
if echo "$COMMIT_MSG" | head -1 | grep -qE "^Merge "; then
    exit 0
fi
# Skip validation for revert commits
if echo "$COMMIT_MSG" | head -1 | grep -qE "^Revert "; then
    exit 0
fi
# Conventional Commits regex
# type(scope): subject
# - type: feat|fix|docs|style|refactor|test|chore|ci|perf (required)
# - scope: letters, digits, Korean, dot, underscore, hyphen (optional)
# - subject: 1-72 characters, mixed Korean/English allowed (required)
PATTERN='^(feat|fix|docs|style|refactor|test|chore|ci|perf)(\([a-zA-Z0-9가-힣._-]+\))?: .{1,72}$'
FIRST_LINE=$(head -1 "$COMMIT_MSG_FILE")
if ! echo "$FIRST_LINE" | grep -qE "$PATTERN"; then
    echo ""
    echo "╔══════════════════════════════════════════════════════════════╗"
    echo "║  Commit message does not match Conventional Commits format   ║"
    echo "╚══════════════════════════════════════════════════════════════╝"
    echo ""
    echo " Valid format: type(scope): subject"
    echo ""
    echo " type (required):"
    echo "   feat     — new feature"
    echo "   fix      — bug fix"
    echo "   docs     — documentation change"
    echo "   style    — code formatting"
    echo "   refactor — refactoring"
    echo "   test     — tests"
    echo "   chore    — build/config change"
    echo "   ci       — CI/CD change"
    echo "   perf     — performance improvement"
    echo ""
    echo " scope (optional): Korean or English"
    echo " subject (required): 1-72 characters, Korean or English"
    echo ""
    echo " Examples:"
    echo "   feat(auth): JWT 기반 로그인 구현"
    echo "   fix(배치): 야간 배치 타임아웃 수정"
    echo "   docs: README 업데이트"
    echo "   chore: Gradle 의존성 업데이트"
    echo ""
    echo " Current message: $FIRST_LINE"
    echo ""
    exit 1
fi

.githooks/post-checkout Executable file

@@ -0,0 +1,25 @@
#!/bin/bash
#==============================================================================
# post-checkout hook
# Automatically sets core.hooksPath when a branch is checked out
# After clone/checkout, if a .githooks directory exists, configure hooksPath automatically
#==============================================================================
# post-checkout parameters: prev_HEAD, new_HEAD, branch_flag
# branch_flag=1: branch checkout, 0: file checkout
BRANCH_FLAG="$3"
# Skip file checkouts
if [ "$BRANCH_FLAG" = "0" ]; then
    exit 0
fi
# Check that the .githooks directory exists
REPO_ROOT=$(git rev-parse --show-toplevel 2>/dev/null)
if [ -d "${REPO_ROOT}/.githooks" ]; then
    CURRENT_HOOKS_PATH=$(git config core.hooksPath 2>/dev/null || echo "")
    if [ "$CURRENT_HOOKS_PATH" != ".githooks" ]; then
        git config core.hooksPath .githooks
        chmod +x "${REPO_ROOT}/.githooks/"* 2>/dev/null
    fi
fi

.githooks/pre-commit Executable file

@@ -0,0 +1,33 @@
#!/bin/bash
#==============================================================================
# pre-commit hook (Java Maven)
# Maven compile check — blocks the commit when compilation fails
#==============================================================================
echo "pre-commit: verifying Maven compilation..."
# Use the Maven Wrapper (fall back to mvn if absent)
if [ -f "./mvnw" ]; then
    MVN="./mvnw"
elif command -v mvn &>/dev/null; then
    MVN="mvn"
else
    echo "Warning: Maven is not installed. Skipping the compile check."
    exit 0
fi
# Compile check (tests skipped, offline capable)
$MVN compile -q -DskipTests 2>&1
RESULT=$?
if [ $RESULT -ne 0 ]; then
    echo ""
    echo "╔══════════════════════════════════════════════════════════╗"
    echo "║  Compilation failed! The commit has been blocked.        ║"
    echo "║  Fix the compile errors and commit again.                ║"
    echo "╚══════════════════════════════════════════════════════════╝"
    echo ""
    exit 1
fi
echo "pre-commit: compilation succeeded"

.gitignore vendored

@@ -93,13 +93,8 @@ application-local.yml
# Logs
logs/
docs/
*.log.*
# Session continuity files (for AI assistants)
.claude/
CLAUDE.md
BASEREADER_ENHANCEMENT_PLAN.md
README.md
nul
# Claude Code (ignore personal files only; team files are tracked)
.claude/settings.local.json
.claude/scripts/

.mvn/settings.xml Normal file

@@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.2.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.2.0
https://maven.apache.org/xsd/settings-1.2.0.xsd">
<servers>
<server>
<id>nexus</id>
<username>admin</username>
<password>Gcsc!8932</password>
</server>
</servers>
<mirrors>
<mirror>
<id>nexus</id>
<name>GC Nexus Repository</name>
<url>https://nexus.gc-si.dev/repository/maven-public/</url>
<mirrorOf>*</mirrorOf>
</mirror>
</mirrors>
</settings>

.sdkmanrc Normal file

@@ -0,0 +1 @@
java=17.0.18-amzn

CLAUDE.md Normal file

@@ -0,0 +1,101 @@
# SNP-Batch-1 (snp-batch-validation)
Integrated maritime data batch system. Collects vessel/port/incident data from external Maritime APIs into PostgreSQL, and serves real-time AIS position data from a cache.
## Tech Stack
- Java 17, Spring Boot 3.2.1, Spring Batch 5.1.0
- PostgreSQL (schema: t_std_snp_data)
- Quartz Scheduler (JDBC store)
- Spring Kafka (AIS Target → Kafka pipeline)
- WebFlux WebClient (external API calls)
- Thymeleaf (batch management web GUI)
- Springdoc OpenAPI 2.3.0 (Swagger)
- Caffeine Cache, JTS (spatial operations)
- Lombok, Jackson
## Build & Run
```bash
# Build
sdk use java 17.0.18-amzn
mvn clean package -DskipTests
# Run
mvn spring-boot:run
# Test
mvn test
```
## Server Settings
- Port: 8041
- Context Path: /snp-api
- Swagger UI: http://localhost:8041/snp-api/swagger-ui/index.html
## Directory Structure
```
src/main/java/com/snp/batch/
├── SnpBatchApplication.java # main application
├── common/ # shared framework
│   ├── batch/ # batch base classes (config, entity, processor, reader, writer)
│   ├── util/ # utilities (JsonChangeDetector, SafeGetDataUtil)
│   └── web/ # web base (ApiResponse, BaseController, BaseService)
├── global/ # global configuration & batch management
│   ├── config/ # AsyncConfig, QuartzConfig, SwaggerConfig, WebClientConfig
│   ├── controller/ # BatchController (/api/batch), WebViewController
│   ├── dto/ # Dashboard, JobExecution, Schedule DTOs
│   ├── model/ # BatchLastExecution, JobScheduleEntity
│   ├── partition/ # partition management (PartitionManagerTasklet)
│   ├── projection/ # DateRangeProjection
│   └── repository/ # BatchApiLog, BatchLastExecution, JobSchedule, Timeline
├── jobs/ # batch job modules (by domain)
│   ├── aistarget/ # AIS Target (real-time positions + cache + REST API + Kafka publishing)
│   ├── aistargetdbsync/ # AIS Target DB sync (cache → DB)
│   ├── common/ # common codes (FlagCode, Stat5Code)
│   ├── compliance/ # compliance (Compliance, CompanyCompliance)
│   ├── event/ # maritime incidents (Event, EventDetail, Cargo, HumanCasualty)
│   ├── movements/ # vessel movements (many sub-jobs)
│   ├── psc/ # PSC inspections
│   ├── risk/ # risk analysis
│   └── ship*/ # vessel particulars (ship001~ship028, 30+ tables)
└── service/ # BatchService, ScheduleService
## Batch Job Pattern
Each job extends the base classes in `common/batch/`:
- **BaseJobConfig** → Job/Step configuration (chunk-oriented)
- **BaseApiReader** → external Maritime API calls (WebClient)
- **BaseProcessor** → DTO→Entity conversion
- **BaseWriter** → PostgreSQL upsert
- **BaseEntity** → common fields (dataHash, lastModifiedDate, etc.)
## Main API Paths (context-path: /snp-api)
### Batch Management (/api/batch)
| Method | Path | Description |
|--------|------|------|
| POST | /jobs/{jobName}/execute | run a batch job |
| GET | /jobs | list jobs |
| GET | /jobs/{jobName}/executions | execution history |
| GET | /executions/{id}/detail | execution detail (including steps) |
| POST | /executions/{id}/stop | stop an execution |
| GET/POST | /schedules | schedule management (CRUD) |
| GET | /dashboard | dashboard |
| GET | /timeline | timeline |
### AIS Target (/api/ais-target)
| Method | Path | Description |
|--------|------|------|
| GET | /{mmsi} | latest position by MMSI |
| POST | /batch | multi-MMSI lookup |
| GET/POST | /search | time/spatial range search |
| POST | /search/filter | conditional filter search (SOG, COG, etc.) |
| POST | /search/polygon | polygon search |
| POST | /search/wkt | WKT search |
| GET | /search/with-distance | circular search with distances |
| GET | /{mmsi}/track | track history |
| GET | /cache/stats | cache statistics |
| DELETE | /cache | clear the cache |
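Combining the server settings with the table above, a couple of illustrative calls can be built like this. The MMSI value is made up, and the commands are only echoed here rather than executed against a running server:

```shell
# Base URL from the server settings above (port 8041, context path /snp-api).
BASE="http://localhost:8041/snp-api/api/ais-target"
MMSI=123456789   # made-up MMSI for illustration

echo "curl -s ${BASE}/${MMSI}"        # latest position for one vessel
echo "curl -s ${BASE}/cache/stats"    # cache statistics
```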
## Lint/Format
- No dedicated lint tooling configured (no Checkstyle or Spotless)
- Use the IDE's default formatter

(Diff not shown: the file is too large.)


@@ -1,517 +0,0 @@
# Swagger API Documentation Guide
**Created**: 2025-10-16
**Version**: 1.0.0
**Project**: SNP Batch - Spring Batch based data integration system
---
## 📋 Completed Swagger Setup
### ✅ Modified Files
1. **BaseController.java** - common CRUD controller abstract class
   - Fixed the Java import alias error (removed `as SwaggerApiResponse`)
   - Consolidated responses into the `responses` attribute of `@Operation`
   - Used the fully qualified annotation: `@io.swagger.v3.oas.annotations.responses.ApiResponse`
2. **ProductWebController.java** - sample product API controller
   - Fixed the Java import alias error
   - Fixed Swagger annotations on custom endpoints
3. **SwaggerConfig.java** - Swagger/OpenAPI 3.0 configuration
   - Dynamic server port (`@Value("${server.port:8081}")`)
   - Added a detailed API description
   - Added a Markdown-formatted description
4. **BatchController.java** - batch management API (already implemented correctly)
---
## 🌐 Swagger UI Access
### Access URLs
```
Swagger UI:      http://localhost:8081/swagger-ui/index.html
API docs (JSON): http://localhost:8081/v3/api-docs
API docs (YAML): http://localhost:8081/v3/api-docs.yaml
```
### API Groups
> **Note**: BaseController is an abstract class, so it does not appear as its own API group.
> All CRUD endpoints are grouped under the `@Tag` of the inheriting controller (e.g. ProductWebController).
#### 1. **Batch Management API** (`/api/batch`)
Batch job execution and schedule management
**Endpoints**:
- `POST /api/batch/jobs/{jobName}/execute` - run a batch job
- `GET /api/batch/jobs` - list batch jobs
- `GET /api/batch/jobs/{jobName}/executions` - execution history
- `POST /api/batch/executions/{executionId}/stop` - stop an execution
- `GET /api/batch/schedules` - list schedules
- `POST /api/batch/schedules` - create a schedule
- `PUT /api/batch/schedules/{jobName}` - update a schedule
- `DELETE /api/batch/schedules/{jobName}` - delete a schedule
- `PATCH /api/batch/schedules/{jobName}/toggle` - enable/disable a schedule
- `GET /api/batch/dashboard` - dashboard data
- `GET /api/batch/timeline` - timeline data
#### 2. **Product API** (`/api/products`)
Sample product CRUD (extends BaseController)
**All endpoints are shown together under the "Product API" group.**
**Common CRUD endpoints** (inherited from BaseController):
- `POST /api/products` - create a product
- `GET /api/products/{id}` - get a product by ID
- `GET /api/products` - list all products
- `GET /api/products/page?offset=0&limit=20` - paged query
- `PUT /api/products/{id}` - update a product
- `DELETE /api/products/{id}` - delete a product
- `GET /api/products/{id}/exists` - existence check
**Custom endpoints**:
- `GET /api/products/by-product-id/{productId}` - lookup by product code
- `GET /api/products/stats/active-count` - active product count
---
## 🛠️ Running and Testing the Application
### 1. Build and Run
```bash
# Maven build (in IntelliJ IDEA)
mvn clean package -DskipTests
# Run the application
mvn spring-boot:run
```
Or in IntelliJ IDEA:
1. Open `SnpBatchApplication.java`
2. Click the ▶ icon next to the main method
3. Select "Run 'SnpBatchApplication'"
### 2. Open Swagger UI
Open the following URL in a browser:
```
http://localhost:8081/swagger-ui/index.html
```
### 3. API Test Examples
#### Example 1: List batch jobs
```http
GET http://localhost:8081/api/batch/jobs
```
**Expected response**:
```json
[
"sampleProductImportJob",
"shipDataImportJob"
]
```
#### Example 2: Run a batch job
```http
POST http://localhost:8081/api/batch/jobs/sampleProductImportJob/execute
```
**Expected response**:
```json
{
"success": true,
"message": "Job started successfully",
"executionId": 1
}
```
#### Example 3: Create a product (sample)
```http
POST http://localhost:8081/api/products
Content-Type: application/json
{
  "productId": "TEST-001",
  "productName": "Test product",
  "category": "Electronics",
  "price": 99.99,
  "stockQuantity": 50,
  "isActive": true,
  "rating": 4.5
}
```
**Expected response**:
```json
{
  "success": true,
  "message": "Product created successfully",
  "data": {
    "id": 1,
    "productId": "TEST-001",
    "productName": "Test product",
    "category": "Electronics",
    "price": 99.99,
    "stockQuantity": 50,
    "isActive": true,
    "rating": 4.5,
    "createdAt": "2025-10-16T10:30:00",
    "updatedAt": "2025-10-16T10:30:00"
  }
}
```
#### Example 4: Paged query
```http
GET http://localhost:8081/api/products/page?offset=0&limit=10
```
**Expected response**:
```json
{
"success": true,
"message": "Retrieved 10 items (total: 100)",
"data": [
{ "id": 1, "productName": "Product 1", ... },
{ "id": 2, "productName": "Product 2", ... },
...
]
}
```
---
## 📚 Swagger Annotation Guide
### Pattern used in BaseController
#### ❌ Incorrect usage (not possible in Java)
```java
// Kotlin-style import aliases are not supported in Java
import io.swagger.v3.oas.annotations.responses.ApiResponse as SwaggerApiResponse;
@ApiResponses(value = {
    @SwaggerApiResponse(responseCode = "200", description = "success")
})
```
#### ✅ Correct usage (fixed)
```java
// import alias removed
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;

@Operation(
    summary = "Create resource",
    description = "Creates a new resource",
    responses = {
        @io.swagger.v3.oas.annotations.responses.ApiResponse(
            responseCode = "200",
            description = "created successfully"
        ),
        @io.swagger.v3.oas.annotations.responses.ApiResponse(
            responseCode = "500",
            description = "server error"
        )
    }
)
@PostMapping
public ResponseEntity<ApiResponse<D>> create(
        @Parameter(description = "resource data to create", required = true)
        @RequestBody D dto) {
    // ...
}
```
### Key Annotations
#### 1. `@Tag` - API grouping
```java
@Tag(name = "Product API", description = "Product management API")
public class ProductWebController extends BaseController<ProductWebDto, Long> {
    // ...
}
```
#### 2. `@Operation` - endpoint documentation
```java
@Operation(
    summary = "short description (shown in the list)",
    description = "detailed description (shown when expanded)",
    responses = { /* response definitions */ }
)
```
#### 3. `@Parameter` - parameter description
```java
@Parameter(
    description = "parameter description",
    required = true,
    example = "example value"
)
@PathVariable String id
```
#### 4. `@io.swagger.v3.oas.annotations.responses.ApiResponse` - response definition
```java
@io.swagger.v3.oas.annotations.responses.ApiResponse(
    responseCode = "200",
    description = "success message",
    content = @Content(
        mediaType = "application/json",
        schema = @Schema(implementation = ProductDto.class)
    )
)
```
---
## 🎯 Applying Swagger to New Controllers
### 1. When extending BaseController
```java
@RestController
@RequestMapping("/api/myresource")
@RequiredArgsConstructor
@Tag(name = "My Resource API", description = "My resource management API")
public class MyResourceController extends BaseController<MyResourceDto, Long> {
    private final MyResourceService myResourceService;

    @Override
    protected BaseService<?, MyResourceDto, Long> getService() {
        return myResourceService;
    }

    @Override
    protected String getResourceName() {
        return "MyResource";
    }

    // CRUD endpoints provided automatically by BaseController:
    //   POST   /api/myresource
    //   GET    /api/myresource/{id}
    //   GET    /api/myresource
    //   GET    /api/myresource/page
    //   PUT    /api/myresource/{id}
    //   DELETE /api/myresource/{id}
    //   GET    /api/myresource/{id}/exists

    // Adding a custom endpoint:
    @Operation(
        summary = "Custom lookup",
        description = "Looks up a resource by a specific condition",
        responses = {
            @io.swagger.v3.oas.annotations.responses.ApiResponse(
                responseCode = "200",
                description = "lookup successful"
            )
        }
    )
    @GetMapping("/custom/{key}")
    public ResponseEntity<ApiResponse<MyResourceDto>> customEndpoint(
            @Parameter(description = "custom key", required = true)
            @PathVariable String key) {
        // implementation...
    }
}
```
### 2. Writing a standalone controller
```java
@RestController
@RequestMapping("/api/custom")
@RequiredArgsConstructor
@Slf4j
@Tag(name = "Custom API", description = "Custom API")
public class CustomController {
    @Operation(
        summary = "Custom action",
        description = "Performs a specific action",
        responses = {
            @io.swagger.v3.oas.annotations.responses.ApiResponse(
                responseCode = "200",
                description = "action successful"
            ),
            @io.swagger.v3.oas.annotations.responses.ApiResponse(
                responseCode = "500",
                description = "server error"
            )
        }
    )
    @PostMapping("/action")
    public ResponseEntity<Map<String, Object>> customAction(
            @Parameter(description = "action parameters", required = true)
            @RequestBody Map<String, String> params) {
        // implementation...
    }
}
```
---
## 🔍 Swagger UI Layout
### Main screen
```
┌─────────────────────────────────────────────────┐
│ SNP Batch REST API                              │
│ Version: v1.0.0                                 │
│ REST API for the Spring Batch based             │
│ data integration system                         │
├─────────────────────────────────────────────────┤
│ Servers:                                        │
│   ▼ http://localhost:8081 (local dev server)    │
├─────────────────────────────────────────────────┤
│                                                 │
│ ▼ Batch Management API                          │
│     POST /api/batch/jobs/{jobName}/execute      │
│     GET  /api/batch/jobs                        │
│     ...                                         │
│                                                 │
│ ▼ Product API (9 endpoints shown together)      │
│     POST   /api/products                        │
│     GET    /api/products/{id}                   │
│     GET    /api/products                        │
│     GET    /api/products/page                   │
│     PUT    /api/products/{id}                   │
│     DELETE /api/products/{id}                   │
│     GET    /api/products/{id}/exists            │
│     GET    /api/products/by-product-id/{...}    │
│     GET    /api/products/stats/active-count     │
│                                                 │
│ (no separate Base API group is shown)           │
│                                                 │
└─────────────────────────────────────────────────┘
```
### Endpoint view
When an endpoint is expanded:
- **Parameters**: parameter input fields
- **Request body**: JSON request body editor
- **Try it out**: button that calls the API
- **Responses**: response codes and examples
- **Curl**: generated curl command
---
## ⚠️ Troubleshooting
### 1. Swagger UI unreachable
**Symptom**: 404 when opening `http://localhost:8081/swagger-ui/index.html`
**Fix**:
1. Confirm the application is running
2. Check the port number (`server.port` in `application.yml`)
3. Try these URLs:
   - `http://localhost:8081/swagger-ui.html`
   - `http://localhost:8081/swagger-ui/`
### 2. 401/403 errors on API calls
**Symptom**: authentication error when clicking "Try it out"
**Fix**:
- No authentication is currently configured (everything is allowed by default)
- If Spring Security is added, the Swagger paths must be permitted:
```java
.authorizeHttpRequests(auth -> auth
.requestMatchers("/swagger-ui/**", "/v3/api-docs/**").permitAll()
.anyRequest().authenticated()
)
```
### 3. An endpoint does not appear
**Symptom**: the controller exists but is not shown in Swagger UI
**Fix**:
1. Check the `@RestController` annotation
2. Check the `@RequestMapping` path
3. Confirm the controller is under the `com.snp.batch` package
4. Restart the application
---
## 📊 Configuration
### application.yml (Swagger-related settings)
```yaml
server:
  port: 8081  # Swagger UI port
# Springdoc OpenAPI settings (add if needed)
springdoc:
  api-docs:
    path: /v3/api-docs       # OpenAPI JSON path
  swagger-ui:
    path: /swagger-ui.html   # Swagger UI path
    enabled: true
    operations-sorter: alpha # endpoint sorting (alpha, method)
    tags-sorter: alpha       # tag sorting
```
---
## 🎓 Further Reading
### Official Swagger annotation docs
- [OpenAPI 3.0 Annotations](https://github.com/swagger-api/swagger-core/wiki/Swagger-2.X---Annotations)
- [Springdoc OpenAPI](https://springdoc.org/)
### Related file locations
```
src/main/java/com/snp/batch/
├── common/web/controller/BaseController.java            # common CRUD base
├── global/config/SwaggerConfig.java                     # Swagger configuration
├── global/controller/BatchController.java               # Batch API
└── jobs/sample/web/controller/ProductWebController.java # Product API
```
---
## ✅ Checklist
Before running the application:
- [ ] Maven build succeeds
- [ ] `application.yml` settings verified
- [ ] PostgreSQL connection verified
- [ ] Port 8081 available
Swagger verification:
- [ ] Swagger UI reachable
- [ ] Batch Management API shown
- [ ] Product API shown
- [ ] "Try it out" works
- [ ] API responses are correct
---
## 📚 Related Documents
### Core documents
- **[README.md](README.md)** - project overview and quick-start guide
- **[DEVELOPMENT_GUIDE.md](DEVELOPMENT_GUIDE.md)** - new-job development guide and base-class usage
- **[CLAUDE.md](CLAUDE.md)** - project configuration-management document (session continuity)
### Architecture documents
- **[docs/architecture/ARCHITECTURE.md](docs/architecture/ARCHITECTURE.md)** - detailed architecture design
- **[docs/architecture/PROJECT_STRUCTURE.md](docs/architecture/PROJECT_STRUCTURE.md)** - job-centric package structure guide
### Implementation guides
- **[docs/guides/PROXY_SERVICE_GUIDE.md](docs/guides/PROXY_SERVICE_GUIDE.md)** - external API proxy pattern guide
- **[docs/guides/SHIP_API_EXAMPLE.md](docs/guides/SHIP_API_EXAMPLE.md)** - hands-on Maritime API integration example
### Security documents
- **[docs/security/README.md](docs/security/README.md)** - security strategy overview (planning stage)
---
**Last updated**: 2025-10-16
**Author**: Claude Code
**Version**: 1.1.0


@@ -13,7 +13,7 @@
</parent>
<groupId>com.snp</groupId>
<artifactId>snp-batch</artifactId>
<artifactId>snp-batch-validation</artifactId>
<version>1.0.0</version>
<name>SNP Batch</name>
<description>Spring Batch project for JSON to PostgreSQL with Web GUI</description>
@@ -111,6 +111,12 @@
<version>2.3.0</version>
</dependency>
<!-- Kafka -->
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<!-- Caffeine Cache -->
<dependency>
<groupId>com.github.ben-manes.caffeine</groupId>


@@ -43,4 +43,22 @@ public abstract class BaseEntity {
* Column: updated_by (VARCHAR(100))
*/
private String updatedBy;
/**
 * Batch job execution ID
 * Column: job_execution_id (int8)
 */
private Long jobExecutionId;
/**
 * Convenience method for setting the common batch fields
 */
@SuppressWarnings("unchecked")
public <T extends BaseEntity> T setBatchInfo(Long jobExecutionId, String createdBy) {
    this.jobExecutionId = jobExecutionId;
    this.createdBy = createdBy;
    // Force the creation timestamp when JPA Auditing has not populated it
    if (this.createdAt == null) this.createdAt = LocalDateTime.now();
    return (T) this;
}
}


@@ -170,6 +170,125 @@ public abstract class BaseApiReader<T> implements ItemReader<T> {
}
}
protected <T, R> List<R> executeWrapperApiCall(
String baseUrl,
String path,
Class<T> responseWrapperClass, // e.g. Stat5CodeApiResponse.class
Function<T, List<R>> listExtractor, // logic that extracts the list from the response object
BatchApiLogService logService) {
String fullUri = UriComponentsBuilder.fromHttpUrl(baseUrl)
.path(path)
.build()
.toUriString();
long startTime = System.currentTimeMillis();
int statusCode = 200;
String errorMessage = null;
Long responseSize = 0L;
try {
log.info("[{}] API request started: {}", getReaderName(), fullUri);
// 1. Receive the wrapper object (T), not a List<R>.
T response = webClient.get()
.uri(uriBuilder -> uriBuilder.path(path).build())
.retrieve()
.bodyToMono(responseWrapperClass)
.block();
// 2. Use the extractor function (listExtractor) to pull out the inner list.
List<R> result = (response != null) ? listExtractor.apply(response) : Collections.emptyList();
responseSize = (long) result.size();
return result;
} catch (WebClientResponseException e) {
statusCode = e.getStatusCode().value();
errorMessage = String.format("API Error: %s", e.getResponseBodyAsString());
throw e;
} catch (Exception e) {
statusCode = 500;
errorMessage = String.format("System Error: %s", e.getMessage());
throw e;
} finally {
long duration = System.currentTimeMillis() - startTime;
logService.saveLog(BatchApiLog.builder()
.apiRequestLocation(getReaderName())
.requestUri(fullUri)
.httpMethod("GET")
.statusCode(statusCode)
.responseTimeMs(duration)
.responseCount(responseSize)
.errorMessage(errorMessage)
.createdAt(LocalDateTime.now())
.jobExecutionId(this.jobExecutionId)
.stepExecutionId(this.stepExecutionId)
.build());
}
}
protected <R> List<R> executeListApiCall(
String baseUrl,
String path,
ParameterizedTypeReference<List<R>> typeReference,
BatchApiLogService logService) {
String fullUri = UriComponentsBuilder.fromHttpUrl(baseUrl)
.path(path)
.build()
.toUriString();
long startTime = System.currentTimeMillis();
int statusCode = 200;
String errorMessage = null;
Long responseSize = 0L;
try {
log.info("[{}] API request started: {}", getReaderName(), fullUri);
List<R> result = webClient.get()
.uri(uriBuilder -> {
uriBuilder.path(path);
return uriBuilder.build();
})
.retrieve()
.bodyToMono(typeReference)
.block();
responseSize = (result != null) ? (long) result.size() : 0L;
return result;
} catch (WebClientResponseException e) {
// The API server responded, but with an error status (4xx, 5xx)
statusCode = e.getStatusCode().value();
errorMessage = String.format("API Error: %s", e.getResponseBodyAsString());
throw e;
} catch (Exception e) {
// Network errors, timeouts, and other exceptions
statusCode = 500;
errorMessage = String.format("System Error: %s", e.getMessage());
throw e;
} finally {
// Always persist the log, regardless of success or failure
long duration = System.currentTimeMillis() - startTime;
logService.saveLog(BatchApiLog.builder()
.apiRequestLocation(getReaderName())
.requestUri(fullUri)
.httpMethod("GET")
.statusCode(statusCode)
.responseTimeMs(duration)
.responseCount(responseSize)
.errorMessage(errorMessage)
.createdAt(LocalDateTime.now())
.jobExecutionId(this.jobExecutionId) // added
.stepExecutionId(this.stepExecutionId) // added
.build());
}
}
/**
 * Unified method for recording API call logs
 * Response JSON structure: { "data": [...] }


@@ -29,9 +29,23 @@ public abstract class BaseJdbcRepository<T, ID> {
protected final JdbcTemplate jdbcTemplate;
/**
* Table name (implemented by subclasses)
* Target schema name (implemented by subclasses)
* Returns the app.batch.target-schema.name value from application.yml, injected via @Value
*/
protected abstract String getTableName();
protected abstract String getTargetSchema();
/**
* Simple table name only (without the schema; implemented by subclasses)
*/
protected abstract String getSimpleTableName();
/**
* Full table name (schema.table)
* Subclasses only need to implement getSimpleTableName()
*/
protected String getTableName() {
return getTargetSchema() + "." + getSimpleTableName();
}
/**
* ID column name (default: "id")


@@ -7,7 +7,7 @@ import org.hibernate.annotations.CreationTimestamp;
import java.time.LocalDateTime;
@Entity
@Table(name = "batch_api_log", schema = "snp_data")
@Table(name = "batch_api_log", schema = "t_std_snp_data")
@Getter
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@AllArgsConstructor


@@ -1,8 +1,8 @@
package com.snp.batch.jobs.aistarget.batch.repository;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -19,104 +19,111 @@ import java.util.Optional;
/**
* AIS Target repository implementation
*
* Table: snp_data.ais_target
* Table: {targetSchema}.ais_target
* PK: mmsi + message_timestamp (composite key)
*/
@Slf4j
@Repository
@RequiredArgsConstructor
public class AisTargetRepositoryImpl implements AisTargetRepository {
private final JdbcTemplate jdbcTemplate;
private final String tableName;
private final String upsertSql;
private static final String TABLE_NAME = "snp_data.ais_target";
public AisTargetRepositoryImpl(JdbcTemplate jdbcTemplate,
@Value("${app.batch.target-schema.name}") String targetSchema) {
this.jdbcTemplate = jdbcTemplate;
this.tableName = targetSchema + ".ais_target";
this.upsertSql = buildUpsertSql(targetSchema);
}
// ==================== UPSERT SQL ====================
private String buildUpsertSql(String schema) {
return """
INSERT INTO %s.ais_target (
mmsi, message_timestamp, imo, name, callsign, vessel_type, extra_info,
lat, lon, geom,
heading, sog, cog, rot,
length, width, draught, length_bow, length_stern, width_port, width_starboard,
destination, eta, status,
age_minutes, position_accuracy, timestamp_utc, repeat_indicator, raim_flag,
radio_status, regional, regional2, spare, spare2,
ais_version, position_fix_type, dte, band_flag,
received_date, collected_at, created_at, updated_at,
tonnes_cargo, in_sts, on_berth, dwt, anomalous,
destination_port_id, destination_tidied, destination_unlocode, imo_verified, last_static_update_received,
lpc_code, message_type, "source", station_id, zone_id
) VALUES (
?, ?, ?, ?, ?, ?, ?,
?, ?, ST_SetSRID(ST_MakePoint(?, ?), 4326),
?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?,
?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?, NOW(), NOW(),
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?
)
ON CONFLICT (mmsi, message_timestamp) DO UPDATE SET
imo = EXCLUDED.imo,
name = EXCLUDED.name,
callsign = EXCLUDED.callsign,
vessel_type = EXCLUDED.vessel_type,
extra_info = EXCLUDED.extra_info,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
geom = EXCLUDED.geom,
heading = EXCLUDED.heading,
sog = EXCLUDED.sog,
cog = EXCLUDED.cog,
rot = EXCLUDED.rot,
length = EXCLUDED.length,
width = EXCLUDED.width,
draught = EXCLUDED.draught,
length_bow = EXCLUDED.length_bow,
length_stern = EXCLUDED.length_stern,
width_port = EXCLUDED.width_port,
width_starboard = EXCLUDED.width_starboard,
destination = EXCLUDED.destination,
eta = EXCLUDED.eta,
status = EXCLUDED.status,
age_minutes = EXCLUDED.age_minutes,
position_accuracy = EXCLUDED.position_accuracy,
timestamp_utc = EXCLUDED.timestamp_utc,
repeat_indicator = EXCLUDED.repeat_indicator,
raim_flag = EXCLUDED.raim_flag,
radio_status = EXCLUDED.radio_status,
regional = EXCLUDED.regional,
regional2 = EXCLUDED.regional2,
spare = EXCLUDED.spare,
spare2 = EXCLUDED.spare2,
ais_version = EXCLUDED.ais_version,
position_fix_type = EXCLUDED.position_fix_type,
dte = EXCLUDED.dte,
band_flag = EXCLUDED.band_flag,
received_date = EXCLUDED.received_date,
collected_at = EXCLUDED.collected_at,
updated_at = NOW(),
tonnes_cargo = EXCLUDED.tonnes_cargo,
in_sts = EXCLUDED.in_sts,
on_berth = EXCLUDED.on_berth,
dwt = EXCLUDED.dwt,
anomalous = EXCLUDED.anomalous,
destination_port_id = EXCLUDED.destination_port_id,
destination_tidied = EXCLUDED.destination_tidied,
destination_unlocode = EXCLUDED.destination_unlocode,
imo_verified = EXCLUDED.imo_verified,
last_static_update_received = EXCLUDED.last_static_update_received,
lpc_code = EXCLUDED.lpc_code,
message_type = EXCLUDED.message_type,
"source" = EXCLUDED."source",
station_id = EXCLUDED.station_id,
zone_id = EXCLUDED.zone_id
""".formatted(schema);
}
private static final String UPSERT_SQL = """
INSERT INTO snp_data.ais_target (
mmsi, message_timestamp, imo, name, callsign, vessel_type, extra_info,
lat, lon, geom,
heading, sog, cog, rot,
length, width, draught, length_bow, length_stern, width_port, width_starboard,
destination, eta, status,
age_minutes, position_accuracy, timestamp_utc, repeat_indicator, raim_flag,
radio_status, regional, regional2, spare, spare2,
ais_version, position_fix_type, dte, band_flag,
received_date, collected_at, created_at, updated_at,
tonnes_cargo, in_sts, on_berth, dwt, anomalous,
destination_port_id, destination_tidied, destination_unlocode, imo_verified, last_static_update_received,
lpc_code, message_type, "source", station_id, zone_id
) VALUES (
?, ?, ?, ?, ?, ?, ?,
?, ?, ST_SetSRID(ST_MakePoint(?, ?), 4326),
?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?,
?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?, NOW(), NOW(),
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?
)
ON CONFLICT (mmsi, message_timestamp) DO UPDATE SET
imo = EXCLUDED.imo,
name = EXCLUDED.name,
callsign = EXCLUDED.callsign,
vessel_type = EXCLUDED.vessel_type,
extra_info = EXCLUDED.extra_info,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
geom = EXCLUDED.geom,
heading = EXCLUDED.heading,
sog = EXCLUDED.sog,
cog = EXCLUDED.cog,
rot = EXCLUDED.rot,
length = EXCLUDED.length,
width = EXCLUDED.width,
draught = EXCLUDED.draught,
length_bow = EXCLUDED.length_bow,
length_stern = EXCLUDED.length_stern,
width_port = EXCLUDED.width_port,
width_starboard = EXCLUDED.width_starboard,
destination = EXCLUDED.destination,
eta = EXCLUDED.eta,
status = EXCLUDED.status,
age_minutes = EXCLUDED.age_minutes,
position_accuracy = EXCLUDED.position_accuracy,
timestamp_utc = EXCLUDED.timestamp_utc,
repeat_indicator = EXCLUDED.repeat_indicator,
raim_flag = EXCLUDED.raim_flag,
radio_status = EXCLUDED.radio_status,
regional = EXCLUDED.regional,
regional2 = EXCLUDED.regional2,
spare = EXCLUDED.spare,
spare2 = EXCLUDED.spare2,
ais_version = EXCLUDED.ais_version,
position_fix_type = EXCLUDED.position_fix_type,
dte = EXCLUDED.dte,
band_flag = EXCLUDED.band_flag,
received_date = EXCLUDED.received_date,
collected_at = EXCLUDED.collected_at,
updated_at = NOW(),
tonnes_cargo = EXCLUDED.tonnes_cargo,
in_sts = EXCLUDED.in_sts,
on_berth = EXCLUDED.on_berth,
dwt = EXCLUDED.dwt,
anomalous = EXCLUDED.anomalous,
destination_port_id = EXCLUDED.destination_port_id,
destination_tidied = EXCLUDED.destination_tidied,
destination_unlocode = EXCLUDED.destination_unlocode,
imo_verified = EXCLUDED.imo_verified,
last_static_update_received = EXCLUDED.last_static_update_received,
lpc_code = EXCLUDED.lpc_code,
message_type = EXCLUDED.message_type,
"source" = EXCLUDED."source",
station_id = EXCLUDED.station_id,
zone_id = EXCLUDED.zone_id
""";
// ==================== RowMapper ====================
@@ -181,7 +188,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
@Override
public Optional<AisTargetEntity> findByMmsiAndMessageTimestamp(Long mmsi, OffsetDateTime messageTimestamp) {
String sql = "SELECT * FROM " + TABLE_NAME + " WHERE mmsi = ? AND message_timestamp = ?";
String sql = "SELECT * FROM " + tableName + " WHERE mmsi = ? AND message_timestamp = ?";
List<AisTargetEntity> results = jdbcTemplate.query(sql, rowMapper, mmsi, toTimestamp(messageTimestamp));
return results.isEmpty() ? Optional.empty() : Optional.of(results.get(0));
}
@@ -193,7 +200,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
WHERE mmsi = ?
ORDER BY message_timestamp DESC
LIMIT 1
""".formatted(TABLE_NAME);
""".formatted(tableName);
List<AisTargetEntity> results = jdbcTemplate.query(sql, rowMapper, mmsi);
return results.isEmpty() ? Optional.empty() : Optional.of(results.get(0));
}
@@ -210,7 +217,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
FROM %s
WHERE mmsi = ANY(?)
ORDER BY mmsi, message_timestamp DESC
""".formatted(TABLE_NAME);
""".formatted(tableName);
Long[] mmsiArray = mmsiList.toArray(new Long[0]);
return jdbcTemplate.query(sql, rowMapper, (Object) mmsiArray);
@@ -223,7 +230,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
WHERE mmsi = ?
AND message_timestamp BETWEEN ? AND ?
ORDER BY message_timestamp ASC
""".formatted(TABLE_NAME);
""".formatted(tableName);
return jdbcTemplate.query(sql, rowMapper, mmsi, toTimestamp(start), toTimestamp(end));
}
@@ -245,7 +252,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
?
)
ORDER BY mmsi, message_timestamp DESC
""".formatted(TABLE_NAME);
""".formatted(tableName);
return jdbcTemplate.query(sql, rowMapper,
toTimestamp(start), toTimestamp(end),
@@ -261,7 +268,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
log.info("AIS Target 배치 UPSERT 시작: {} 건", entities.size());
jdbcTemplate.batchUpdate(UPSERT_SQL, entities, 1000, (ps, entity) -> {
jdbcTemplate.batchUpdate(upsertSql, entities, 1000, (ps, entity) -> {
int idx = 1;
// PK
ps.setLong(idx++, entity.getMmsi());
@@ -336,7 +343,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
@Override
public long count() {
String sql = "SELECT COUNT(*) FROM " + TABLE_NAME;
String sql = "SELECT COUNT(*) FROM " + tableName;
Long count = jdbcTemplate.queryForObject(sql, Long.class);
return count != null ? count : 0L;
}
@@ -344,7 +351,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
@Override
@Transactional
public int deleteOlderThan(OffsetDateTime threshold) {
String sql = "DELETE FROM " + TABLE_NAME + " WHERE message_timestamp < ?";
String sql = "DELETE FROM " + tableName + " WHERE message_timestamp < ?";
int deleted = jdbcTemplate.update(sql, toTimestamp(threshold));
log.info("AIS Target 오래된 데이터 삭제 완료: {} 건 (기준: {})", deleted, threshold);
return deleted;


@@ -4,6 +4,7 @@ import com.snp.batch.common.batch.writer.BaseWriter;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import com.snp.batch.jobs.aistarget.cache.AisTargetCacheManager;
import com.snp.batch.jobs.aistarget.classifier.AisClassTypeClassifier;
import com.snp.batch.jobs.aistarget.kafka.AisTargetKafkaProducer;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Component;
@@ -15,10 +16,11 @@ import java.util.List;
 * Behavior:
 * 1. Classify ClassType (A/B classification based on the Core20 cache)
 * 2. Update the cache with the latest position data (including classType and core20Mmsi)
 * 3. Publish AIS Target data to the Kafka topic (split into sub-chunks)
 *
 * Notes:
 * - DB persistence runs in a separate job (aisTargetDbSyncJob) on a 15-minute cycle
 * - The writer itself only updates the cache
 * - By default, Kafka send failures are only logged and processing continues
*/
@Slf4j
@Component
@@ -26,13 +28,16 @@ public class AisTargetDataWriter extends BaseWriter<AisTargetEntity> {
private final AisTargetCacheManager cacheManager;
private final AisClassTypeClassifier classTypeClassifier;
private final AisTargetKafkaProducer kafkaProducer;
public AisTargetDataWriter(
AisTargetCacheManager cacheManager,
AisClassTypeClassifier classTypeClassifier) {
AisClassTypeClassifier classTypeClassifier,
AisTargetKafkaProducer kafkaProducer) {
super("AisTarget");
this.cacheManager = cacheManager;
this.classTypeClassifier = classTypeClassifier;
this.kafkaProducer = kafkaProducer;
}
@Override
@@ -48,5 +53,19 @@ public class AisTargetDataWriter extends BaseWriter<AisTargetEntity> {
log.debug("AIS Target 캐시 업데이트 완료: {} 건 (캐시 크기: {})",
items.size(), cacheManager.size());
// 3. Publish to Kafka (only when enabled=true in configuration)
if (!kafkaProducer.isEnabled()) {
log.debug("AIS Kafka 전송 비활성화 - topic 전송 스킵");
return;
}
AisTargetKafkaProducer.PublishSummary summary = kafkaProducer.publish(items);
log.info("AIS Kafka 전송 완료 - topic: {}, 요청: {}, 성공: {}, 실패: {}, 스킵: {}",
kafkaProducer.getTopic(),
summary.getRequestedCount(),
summary.getSuccessCount(),
summary.getFailedCount(),
summary.getSkippedCount());
}
}


@@ -0,0 +1,55 @@
package com.snp.batch.jobs.aistarget.kafka;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
/**
 * AIS Target Kafka message schema
*/
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
public class AisTargetKafkaMessage {
/**
 * Unique event identifier
 * - Format: {mmsi}_{messageTimestamp}
 */
private String eventId;
/**
 * Vessel identifier, identical to the Kafka message key
 */
private String key;
/**
 * Kafka publish time (UTC)
 */
private OffsetDateTime publishedAt;
/**
 * All fields of the raw/processed AIS data
 */
private AisTargetEntity payload;
public static AisTargetKafkaMessage from(AisTargetEntity entity) {
String key = entity.getMmsi() != null ? String.valueOf(entity.getMmsi()) : null;
String messageTs = entity.getMessageTimestamp() != null ? entity.getMessageTimestamp().toString() : "null";
return AisTargetKafkaMessage.builder()
.eventId(key + "_" + messageTs)
.key(key)
.publishedAt(OffsetDateTime.now(ZoneOffset.UTC))
.payload(entity)
.build();
}
}
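The `from(...)` factory above builds the `eventId` as `{mmsi}_{messageTimestamp}`. A minimal standalone sketch of that scheme, with illustrative names rather than the project's actual types:

```java
import java.time.OffsetDateTime;

// Sketch of the eventId scheme described above ({mmsi}_{messageTimestamp}).
class EventIdSketch {
    static String eventId(Long mmsi, OffsetDateTime messageTimestamp) {
        String key = mmsi != null ? String.valueOf(mmsi) : null;
        String ts = messageTimestamp != null ? messageTimestamp.toString() : "null";
        return key + "_" + ts; // a null mmsi would yield a literal "null_" prefix
    }

    public static void main(String[] args) {
        OffsetDateTime ts = OffsetDateTime.parse("2026-02-13T03:10:38+09:00");
        System.out.println(eventId(440123456L, ts)); // → 440123456_2026-02-13T03:10:38+09:00
    }
}
```

Note that a null MMSI would produce a literal `null_…` prefix; the producer filters such entities out via `isValid` before sending.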


@@ -0,0 +1,207 @@
package com.snp.batch.jobs.aistarget.kafka;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicInteger;
/**
* AIS Target Kafka Producer
*
 * Policy:
 * - key: MMSI
 * - value: AisTargetKafkaMessage (JSON)
 * - On send failure, by default only log and continue (failOnSendError=false)
*/
@Slf4j
@Component
@RequiredArgsConstructor
public class AisTargetKafkaProducer {
private final KafkaTemplate<String, String> kafkaTemplate;
private final ObjectMapper objectMapper;
private final AisTargetKafkaProperties kafkaProperties;
public boolean isEnabled() {
return kafkaProperties.isEnabled();
}
public String getTopic() {
return kafkaProperties.getTopic();
}
/**
 * Splits the collected chunk into sub-chunks and publishes them to Kafka
*/
public PublishSummary publish(List<AisTargetEntity> entities) {
if (!isEnabled()) {
return PublishSummary.disabled();
}
if (entities == null || entities.isEmpty()) {
return PublishSummary.empty();
}
int subChunkSize = Math.max(1, kafkaProperties.getSendChunkSize());
PublishSummary totalSummary = PublishSummary.empty();
for (int from = 0; from < entities.size(); from += subChunkSize) {
int to = Math.min(from + subChunkSize, entities.size());
List<AisTargetEntity> subChunk = entities.subList(from, to);
PublishSummary chunkSummary = publishSubChunk(subChunk);
totalSummary.merge(chunkSummary);
log.info("AIS Kafka 서브청크 전송 완료 - topic: {}, 범위: {}~{}, 요청: {}, 성공: {}, 실패: {}, 스킵: {}",
getTopic(), from, to - 1,
chunkSummary.getRequestedCount(),
chunkSummary.getSuccessCount(),
chunkSummary.getFailedCount(),
chunkSummary.getSkippedCount());
}
if (kafkaProperties.isFailOnSendError() && totalSummary.getFailedCount() > 0) {
throw new IllegalStateException("AIS Kafka 전송 실패 건수: " + totalSummary.getFailedCount());
}
return totalSummary;
}
private PublishSummary publishSubChunk(List<AisTargetEntity> subChunk) {
AtomicInteger successCount = new AtomicInteger(0);
AtomicInteger failedCount = new AtomicInteger(0);
AtomicInteger skippedCount = new AtomicInteger(0);
AtomicInteger sampledErrorLogs = new AtomicInteger(0);
List<CompletableFuture<Void>> futures = new ArrayList<>(subChunk.size());
for (AisTargetEntity entity : subChunk) {
if (!isValid(entity)) {
skippedCount.incrementAndGet();
continue;
}
try {
String key = String.valueOf(entity.getMmsi());
String payload = objectMapper.writeValueAsString(AisTargetKafkaMessage.from(entity));
CompletableFuture<Void> trackedFuture = kafkaTemplate.send(getTopic(), key, payload)
.handle((result, ex) -> {
if (ex != null) {
failedCount.incrementAndGet();
logSendError(sampledErrorLogs,
"AIS Kafka 전송 실패 - topic: " + getTopic()
+ ", key: " + key
+ ", messageTimestamp: " + entity.getMessageTimestamp()
+ ", error: " + ex.getMessage());
} else {
successCount.incrementAndGet();
}
return null;
});
futures.add(trackedFuture);
} catch (JsonProcessingException e) {
failedCount.incrementAndGet();
logSendError(sampledErrorLogs,
"AIS Kafka 메시지 직렬화 실패 - mmsi: " + entity.getMmsi()
+ ", messageTimestamp: " + entity.getMessageTimestamp()
+ ", error: " + e.getMessage());
} catch (Exception e) {
failedCount.incrementAndGet();
logSendError(sampledErrorLogs,
"AIS Kafka 전송 요청 실패 - mmsi: " + entity.getMmsi()
+ ", messageTimestamp: " + entity.getMessageTimestamp()
+ ", error: " + e.getMessage());
}
}
if (!futures.isEmpty()) {
CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
kafkaTemplate.flush();
}
return PublishSummary.of(
false,
subChunk.size(),
successCount.get(),
failedCount.get(),
skippedCount.get()
);
}
private boolean isValid(AisTargetEntity entity) {
return entity != null
&& entity.getMmsi() != null
&& entity.getMessageTimestamp() != null;
}
private void logSendError(AtomicInteger sampledErrorLogs, String message) {
int current = sampledErrorLogs.incrementAndGet();
if (current <= 5) {
log.error(message);
return;
}
if (current == 6) {
log.error("AIS Kafka 전송 오류 로그가 많아 이후 상세 로그는 생략합니다.");
}
}
@Getter
public static class PublishSummary {
private final boolean disabled;
private int requestedCount;
private int successCount;
private int failedCount;
private int skippedCount;
private PublishSummary(
boolean disabled,
int requestedCount,
int successCount,
int failedCount,
int skippedCount
) {
this.disabled = disabled;
this.requestedCount = requestedCount;
this.successCount = successCount;
this.failedCount = failedCount;
this.skippedCount = skippedCount;
}
public static PublishSummary disabled() {
return of(true, 0, 0, 0, 0);
}
public static PublishSummary empty() {
return of(false, 0, 0, 0, 0);
}
public static PublishSummary of(
boolean disabled,
int requestedCount,
int successCount,
int failedCount,
int skippedCount
) {
return new PublishSummary(disabled, requestedCount, successCount, failedCount, skippedCount);
}
public void merge(PublishSummary other) {
this.requestedCount += other.requestedCount;
this.successCount += other.successCount;
this.failedCount += other.failedCount;
this.skippedCount += other.skippedCount;
}
}
}
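The `publish()` loop above slices the collected chunk into sub-chunks of at most `sendChunkSize` before sending. The slicing arithmetic can be sketched in isolation (illustrative names, not the project's classes):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.IntStream;

// Sketch of the sub-chunk split used by publish(): the source list is cut into
// slices of at most sendChunkSize elements.
class SubChunkSketch {
    static <T> List<List<T>> split(List<T> items, int sendChunkSize) {
        int size = Math.max(1, sendChunkSize); // same guard as the producer
        List<List<T>> chunks = new ArrayList<>();
        for (int from = 0; from < items.size(); from += size) {
            int to = Math.min(from + size, items.size());
            chunks.add(items.subList(from, to)); // views over the source list
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> items = IntStream.range(0, 12).boxed().toList();
        List<List<Integer>> chunks = split(items, 5);
        System.out.println(chunks.size()); // 3 sub-chunks: 5 + 5 + 2 elements
        System.out.println(chunks.get(2)); // [10, 11]
    }
}
```

The `Math.max(1, …)` guard mirrors the producer's defense against a non-positive `sendChunkSize` from configuration.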


@@ -0,0 +1,36 @@
package com.snp.batch.jobs.aistarget.kafka;
import lombok.Getter;
import lombok.Setter;
import org.springframework.boot.context.properties.ConfigurationProperties;
/**
 * AIS Target Kafka publishing settings
*/
@Getter
@Setter
@ConfigurationProperties(prefix = "app.batch.ais-target.kafka")
public class AisTargetKafkaProperties {
/**
 * Whether Kafka publishing is enabled
 */
private boolean enabled = true;
/**
 * Target topic for publishing
 */
private String topic = "tp_SNP_AIS_Signal";
/**
 * Sub-chunk size for Kafka sends
 * Splits the send batch independently of the collection chunk size (e.g. 50,000).
 */
private int sendChunkSize = 5000;
/**
 * Whether a send failure fails the Step
 * If false, failures are only logged and processing continues.
 */
private boolean failOnSendError = false;
}
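Assuming Spring Boot's standard relaxed binding for the `app.batch.ais-target.kafka` prefix above, an `application.yml` fragment matching these defaults might look like the following (the topic value shown is this class's default; a later commit in this range renames it to `tp_Global_AIS_Signal`):

```yaml
app:
  batch:
    ais-target:
      kafka:
        enabled: true
        topic: tp_SNP_AIS_Signal
        send-chunk-size: 5000
        fail-on-send-error: false
```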


@@ -7,9 +7,12 @@ import com.snp.batch.jobs.common.batch.processor.FlagCodeDataProcessor;
import com.snp.batch.jobs.common.batch.reader.FlagCodeDataReader;
import com.snp.batch.jobs.common.batch.repository.FlagCodeRepository;
import com.snp.batch.jobs.common.batch.writer.FlagCodeDataWriter;
import com.snp.batch.jobs.facility.batch.reader.PortDataReader;
import com.snp.batch.service.BatchApiLogService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
@@ -25,8 +28,14 @@ import org.springframework.web.reactive.function.client.WebClient;
@Configuration
public class FlagCodeImportJobConfig extends BaseJobConfig<FlagCodeDto, FlagCodeEntity> {
private final FlagCodeDataProcessor flagCodeDataProcessor;
private final FlagCodeDataReader flagCodeDataReader;
private final FlagCodeRepository flagCodeRepository;
private final WebClient maritimeApiWebClient;
private final BatchApiLogService batchApiLogService;
@Value("${app.batch.ship-api.url}")
private String maritimeApiUrl;
@Value("${app.batch.chunk-size:1000}")
private int chunkSize;
@@ -39,10 +48,16 @@ public class FlagCodeImportJobConfig extends BaseJobConfig<FlagCodeDto, FlagCode
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
FlagCodeRepository flagCodeRepository,
@Qualifier("maritimeApiWebClient") WebClient maritimeApiWebClient) {
FlagCodeDataProcessor flagCodeDataProcessor,
@Qualifier("maritimeApiWebClient") WebClient maritimeApiWebClient,
FlagCodeDataReader flagCodeDataReader,
BatchApiLogService batchApiLogService) {
super(jobRepository, transactionManager);
this.flagCodeRepository = flagCodeRepository;
this.maritimeApiWebClient = maritimeApiWebClient;
this.flagCodeDataProcessor = flagCodeDataProcessor;
this.flagCodeDataReader = flagCodeDataReader;
this.batchApiLogService = batchApiLogService;
}
@Override
@@ -57,14 +72,29 @@ public class FlagCodeImportJobConfig extends BaseJobConfig<FlagCodeDto, FlagCode
@Override
protected ItemReader<FlagCodeDto> createReader() {
return new FlagCodeDataReader(maritimeApiWebClient);
return flagCodeDataReader;
}
@Bean
@StepScope
public FlagCodeDataReader flagCodeDataReader(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId, // extract the ID via SpEL
@Value("#{stepExecution.id}") Long stepExecutionId
) {
FlagCodeDataReader reader = new FlagCodeDataReader(maritimeApiWebClient, batchApiLogService, maritimeApiUrl);
reader.setExecutionIds(jobExecutionId, stepExecutionId); // set the execution IDs
return reader;
}
@Override
protected ItemProcessor<FlagCodeDto, FlagCodeEntity> createProcessor() {
return new FlagCodeDataProcessor(flagCodeRepository);
return flagCodeDataProcessor;
}
@Bean
@StepScope
public FlagCodeDataProcessor flagCodeDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
return new FlagCodeDataProcessor(jobExecutionId);
}
@Override
protected ItemWriter<FlagCodeEntity> createWriter() {
return new FlagCodeDataWriter(flagCodeRepository);


@@ -7,14 +7,17 @@ import com.snp.batch.jobs.common.batch.processor.Stat5CodeDataProcessor;
import com.snp.batch.jobs.common.batch.reader.Stat5CodeDataReader;
import com.snp.batch.jobs.common.batch.repository.Stat5CodeRepository;
import com.snp.batch.jobs.common.batch.writer.Stat5CodeDataWriter;
import com.snp.batch.service.BatchApiLogService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;
@@ -24,16 +27,27 @@ import org.springframework.web.reactive.function.client.WebClient;
@Configuration
public class Stat5CodeImportJobConfig extends BaseJobConfig<Stat5CodeDto, Stat5CodeEntity> {
private final Stat5CodeDataProcessor stat5CodeDataProcessor;
private final Stat5CodeDataReader stat5CodeDataReader;
private final Stat5CodeRepository stat5CodeRepository;
private final BatchApiLogService batchApiLogService;
private final WebClient maritimeAisApiWebClient;
@Value("${app.batch.ais-api.url}")
private String maritimeAisApiUrl;
public Stat5CodeImportJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
Stat5CodeRepository stat5CodeRepository,
@Qualifier("maritimeAisApiWebClient") WebClient maritimeAisApiWebClient) {
@Qualifier("maritimeAisApiWebClient") WebClient maritimeAisApiWebClient,
Stat5CodeDataProcessor stat5CodeDataProcessor,
Stat5CodeDataReader stat5CodeDataReader,
BatchApiLogService batchApiLogService) {
super(jobRepository, transactionManager);
this.stat5CodeRepository = stat5CodeRepository;
this.maritimeAisApiWebClient = maritimeAisApiWebClient;
this.stat5CodeDataProcessor = stat5CodeDataProcessor;
this.stat5CodeDataReader = stat5CodeDataReader;
this.batchApiLogService = batchApiLogService;
}
@Override
@@ -45,11 +59,27 @@ public class Stat5CodeImportJobConfig extends BaseJobConfig<Stat5CodeDto, Stat5C
}
@Override
protected ItemReader<Stat5CodeDto> createReader() { return new Stat5CodeDataReader(maritimeAisApiWebClient); }
protected ItemReader<Stat5CodeDto> createReader() { return stat5CodeDataReader; }
@Bean
@StepScope
public Stat5CodeDataReader stat5CodeDataReader(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId, // extract the ID via SpEL
@Value("#{stepExecution.id}") Long stepExecutionId
) {
Stat5CodeDataReader reader = new Stat5CodeDataReader(maritimeAisApiWebClient, batchApiLogService, maritimeAisApiUrl);
reader.setExecutionIds(jobExecutionId, stepExecutionId); // set the execution IDs
return reader;
}
@Override
protected ItemProcessor<Stat5CodeDto, Stat5CodeEntity> createProcessor() { return new Stat5CodeDataProcessor(stat5CodeRepository); }
protected ItemProcessor<Stat5CodeDto, Stat5CodeEntity> createProcessor() { return stat5CodeDataProcessor; }
@Bean
@StepScope
public Stat5CodeDataProcessor stat5CodeDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
return new Stat5CodeDataProcessor(jobExecutionId);
}
@Override
protected ItemWriter<Stat5CodeEntity> createWriter() { return new Stat5CodeDataWriter(stat5CodeRepository); }


@@ -3,16 +3,16 @@ package com.snp.batch.jobs.common.batch.processor;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.common.batch.dto.FlagCodeDto;
import com.snp.batch.jobs.common.batch.entity.FlagCodeEntity;
import com.snp.batch.jobs.common.batch.repository.FlagCodeRepository;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
@Slf4j
public class FlagCodeDataProcessor extends BaseProcessor<FlagCodeDto, FlagCodeEntity> {
private final FlagCodeRepository commonCodeRepository;
private final Long jobExecutionId;
public FlagCodeDataProcessor(FlagCodeRepository commonCodeRepository) {
this.commonCodeRepository = commonCodeRepository;
public FlagCodeDataProcessor(@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
this.jobExecutionId = jobExecutionId;
}
@Override
@@ -24,6 +24,8 @@ public class FlagCodeDataProcessor extends BaseProcessor<FlagCodeDto, FlagCodeEn
.decode(dto.getDecode())
.iso2(dto.getIso2())
.iso3(dto.getIso3())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
log.debug("국가코드 데이터 처리 완료: FlagCode={}", dto.getCode());


@@ -3,16 +3,16 @@ package com.snp.batch.jobs.common.batch.processor;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.common.batch.dto.Stat5CodeDto;
import com.snp.batch.jobs.common.batch.entity.Stat5CodeEntity;
import com.snp.batch.jobs.common.batch.repository.Stat5CodeRepository;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
@Slf4j
public class Stat5CodeDataProcessor extends BaseProcessor<Stat5CodeDto, Stat5CodeEntity> {
private final Stat5CodeRepository stat5CodeRepository;
private final Long jobExecutionId;
public Stat5CodeDataProcessor(Stat5CodeRepository stat5CodeRepository) {
this.stat5CodeRepository = stat5CodeRepository;
public Stat5CodeDataProcessor(@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
this.jobExecutionId = jobExecutionId;
}
@Override
@@ -30,6 +30,8 @@ public class Stat5CodeDataProcessor extends BaseProcessor<Stat5CodeDto, Stat5Cod
.Level5Decode(dto.getLevel5Decode())
.Description(dto.getDescription())
.Release(Integer.toString(dto.getRelease()))
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
log.debug("Stat5Code 데이터 처리 완료: Stat5Code={}", dto.getLevel5());


@@ -3,17 +3,22 @@ package com.snp.batch.jobs.common.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.common.batch.dto.FlagCodeApiResponse;
import com.snp.batch.jobs.common.batch.dto.FlagCodeDto;
import com.snp.batch.service.BatchApiLogService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.web.reactive.function.client.WebClient;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@Slf4j
public class FlagCodeDataReader extends BaseApiReader<FlagCodeDto> {
public FlagCodeDataReader(WebClient webClient) {
private final BatchApiLogService batchApiLogService;
String maritimeApiUrl;
public FlagCodeDataReader(WebClient webClient, BatchApiLogService batchApiLogService, String maritimeApiUrl) {
super(webClient); // pass the WebClient to BaseApiReader
this.batchApiLogService = batchApiLogService;
this.maritimeApiUrl = maritimeApiUrl;
}
// ========================================
@@ -24,28 +29,25 @@ public class FlagCodeDataReader extends BaseApiReader<FlagCodeDto> {
protected String getReaderName() {
return "FlagCodeDataReader";
}
@Override
protected String getApiPath() {
return "/MaritimeWCF/APSShipService.svc/RESTFul/GetAssociatedFlagISOByName";
}
@Override
protected List<FlagCodeDto> fetchDataFromApi() {
try {
log.info("GetAssociatedFlagISOByName API 호출 시작");
FlagCodeApiResponse response = webClient
.get()
.uri(uriBuilder -> uriBuilder
.path("/MaritimeWCF/APSShipService.svc/RESTFul/GetAssociatedFlagISOByName")
.build())
.retrieve()
.bodyToMono(FlagCodeApiResponse.class)
.block();
if (response != null && response.getAssociatedFlagISODetails() != null) {
log.info("API 응답 성공: 총 {} 건의 국가코드 데이터 수신", response.getAssociatedCount());
return response.getAssociatedFlagISODetails();
} else {
log.warn("API 응답이 null이거나 국가코드 데이터가 없습니다");
return new ArrayList<>();
}
// delegate to the shared helper
List<FlagCodeDto> result = executeWrapperApiCall(
maritimeApiUrl, // base URL (field or configuration value)
getApiPath(), // API path
FlagCodeApiResponse.class, // wrapper class for the response
FlagCodeApiResponse::getAssociatedFlagISODetails, // list extractor (method reference)
batchApiLogService // API log service
);
// return an empty list when the result is null (safety net)
return result != null ? result : Collections.emptyList();
} catch (Exception e) {
log.error("GetAssociatedFlagISOByName API 호출 실패", e);
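Both readers in this range were refactored onto a shared `executeWrapperApiCall(...)` helper: fetch a wrapper DTO, extract its list via a method reference, and fall back to an empty list. Its exact signature lives in `BaseApiReader`, so the following is only a sketch of that shape under assumed names, with a `Supplier` standing in for the WebClient call:

```java
import java.util.Collections;
import java.util.List;
import java.util.function.Function;
import java.util.function.Supplier;

// Illustrative shape of the wrapper-call pattern: fetch, extract, null-safe fallback.
class WrapperCallSketch {
    static <W, T> List<T> callAndExtract(Supplier<W> fetcher, Function<W, List<T>> extractor) {
        W wrapper = fetcher.get(); // e.g. GET {baseUrl}{apiPath} -> wrapper DTO
        List<T> items = wrapper != null ? extractor.apply(wrapper) : null;
        return items != null ? items : Collections.emptyList(); // safety net, as in the readers
    }

    public static void main(String[] args) {
        List<String> ok = callAndExtract(() -> List.of("KR", "US"), w -> w);
        List<String> fallback = callAndExtract(() -> (List<String>) null, w -> w);
        System.out.println(ok);       // [KR, US]
        System.out.println(fallback); // []
    }
}
```

Passing the extractor as a method reference (e.g. `FlagCodeApiResponse::getAssociatedFlagISODetails`) is what lets one helper serve response wrappers with differently named list fields.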


@@ -3,43 +3,46 @@ package com.snp.batch.jobs.common.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.common.batch.dto.Stat5CodeApiResponse;
import com.snp.batch.jobs.common.batch.dto.Stat5CodeDto;
import com.snp.batch.service.BatchApiLogService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.web.reactive.function.client.WebClient;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@Slf4j
public class Stat5CodeDataReader extends BaseApiReader<Stat5CodeDto> {
public Stat5CodeDataReader(WebClient webClient) {
private final BatchApiLogService batchApiLogService;
String maritimeAisApiUrl;
public Stat5CodeDataReader(WebClient webClient, BatchApiLogService batchApiLogService, String maritimeAisApiUrl) {
super(webClient); // pass the WebClient to BaseApiReader
this.batchApiLogService = batchApiLogService;
this.maritimeAisApiUrl = maritimeAisApiUrl;
}
@Override
protected String getReaderName() {
return "Stat5CodeDataReader";
}
@Override
protected String getApiPath() {
return "/AisSvc.svc/AIS/GetStatcodes";
}
@Override
protected List<Stat5CodeDto> fetchDataFromApi() {
try {
log.info("GetStatcodes API 호출 시작");
Stat5CodeApiResponse response = webClient
.get()
.uri(uriBuilder -> uriBuilder
.path("/AisSvc.svc/AIS/GetStatcodes")
.build())
.retrieve()
.bodyToMono(Stat5CodeApiResponse.class)
.block();
if (response != null && response.getStatcodeArr() != null) {
log.info("API 응답 성공: 총 {} 건의 Stat5Code 데이터 수신", response.getStatcodeArr().size());
return response.getStatcodeArr();
} else {
log.warn("API 응답이 null이거나 Stat5Code 데이터가 없습니다");
return new ArrayList<>();
}
// delegate to the shared helper
List<Stat5CodeDto> result = executeWrapperApiCall(
maritimeAisApiUrl, // base URL (field or configuration value)
getApiPath(), // API path
Stat5CodeApiResponse.class, // wrapper class for the response
Stat5CodeApiResponse::getStatcodeArr, // list extractor (method reference)
batchApiLogService // API log service
);
// return an empty list when the result is null (safety net)
return result != null ? result : Collections.emptyList();
} catch (Exception e) {
log.error("GetStatcodes API 호출 실패", e);
log.error("에러 메시지: {}", e.getMessage());


@@ -3,29 +3,42 @@ package com.snp.batch.jobs.common.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.common.batch.entity.FlagCodeEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.Types;
import java.util.List;
@Slf4j
@Repository("FlagCodeRepository")
public class FlagCodeRepositoryImpl extends BaseJdbcRepository<FlagCodeEntity, String> implements FlagCodeRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.code-002}")
private String tableName;
public FlagCodeRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getEntityName() {
return "FlagCodeEntity";
}
@Override
protected String getTableName() {
return "snp_data.flagcode";
protected String getSimpleTableName() {
return tableName;
}
@@ -37,17 +50,11 @@ public class FlagCodeRepositoryImpl extends BaseJdbcRepository<FlagCodeEntity, S
@Override
protected String getUpdateSql() {
return """
INSERT INTO snp_data.flagcode (
datasetversion, code, decode, iso2, iso3
) VALUES (?, ?, ?, ?, ?)
ON CONFLICT (code)
DO UPDATE SET
datasetversion = EXCLUDED.datasetversion,
decode = EXCLUDED.decode,
iso2 = EXCLUDED.iso2,
iso3 = EXCLUDED.iso3,
batch_flag = 'N'
""";
INSERT INTO %s(
dataset_ver, ship_country_cd, cd_nm, iso_two_cd, iso_thr_cd,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
@@ -63,6 +70,8 @@ public class FlagCodeRepositoryImpl extends BaseJdbcRepository<FlagCodeEntity, S
ps.setString(idx++, entity.getDecode());
ps.setString(idx++, entity.getIso2());
ps.setString(idx++, entity.getIso3());
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
@Override


@@ -3,28 +3,42 @@ package com.snp.batch.jobs.common.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.common.batch.entity.Stat5CodeEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.Types;
import java.util.List;
@Slf4j
@Repository("Stat5CodeRepository")
public class Stat5CodeRepositoryImpl extends BaseJdbcRepository<Stat5CodeEntity, String> implements Stat5CodeRepository{
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.code-001}")
private String tableName;
public Stat5CodeRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getEntityName() {
return "Stat5CodeEntity";
}
@Override
protected String getTableName() {
return "snp_data.stat5code";
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -45,25 +59,11 @@ public class Stat5CodeRepositoryImpl extends BaseJdbcRepository<Stat5CodeEntity,
@Override
protected String getUpdateSql() {
return """
INSERT INTO snp_data.stat5code (
level1, level1decode, level2, level2decode, level3, level3decode, level4, level4decode, level5, level5decode, description, release
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (level1, level2, level3, level4, level5)
DO UPDATE SET
level1 = EXCLUDED.level1,
level1decode = EXCLUDED.level1decode,
level2 = EXCLUDED.level2,
level2decode = EXCLUDED.level2decode,
level3 = EXCLUDED.level3,
level3decode = EXCLUDED.level3decode,
level4 = EXCLUDED.level4,
level4decode = EXCLUDED.level4decode,
level5 = EXCLUDED.level5,
level5decode = EXCLUDED.level5decode,
description = EXCLUDED.description,
release = EXCLUDED.release,
batch_flag = 'N'
""";
INSERT INTO %s(
lv_one, lv_one_desc, lv_two, lv_two_desc, lv_thr, lv_thr_desc, lv_four, lv_four_desc, lv_five, lv_five_desc, dtl_desc, rls_iem,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
@@ -86,6 +86,8 @@ public class Stat5CodeRepositoryImpl extends BaseJdbcRepository<Stat5CodeEntity,
ps.setString(idx++, entity.getLevel5Decode());
ps.setString(idx++, entity.getDescription());
ps.setString(idx++, entity.getRelease());
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
@Override


@@ -2,12 +2,9 @@ package com.snp.batch.jobs.compliance.batch.config;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.compliance.batch.dto.CompanyComplianceDto;
import com.snp.batch.jobs.compliance.batch.dto.ComplianceDto;
import com.snp.batch.jobs.compliance.batch.entity.CompanyComplianceEntity;
import com.snp.batch.jobs.compliance.batch.entity.ComplianceEntity;
import com.snp.batch.jobs.compliance.batch.processor.CompanyComplianceDataProcessor;
import com.snp.batch.jobs.compliance.batch.reader.CompanyComplianceDataRangeReader;
import com.snp.batch.jobs.compliance.batch.reader.ComplianceDataRangeReader;
import com.snp.batch.jobs.compliance.batch.writer.CompanyComplianceDataWriter;
import com.snp.batch.service.BatchApiLogService;
import com.snp.batch.service.BatchDateService;
@@ -50,9 +47,12 @@ public class CompanyComplianceImportRangeJobConfig extends BaseMultiStepJobConfi
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "COMPANY_COMPLIANCE_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override
protected int getChunkSize() {
@@ -91,7 +91,7 @@ public class CompanyComplianceImportRangeJobConfig extends BaseMultiStepJobConfi
protected Job createJobFlow(JobBuilder jobBuilder) {
return jobBuilder
.start(companyComplianceImportRangeStep()) // run step 1
.next(companyComplianceHistoryValueChangeManageStep()) // run step 2
// .next(companyComplianceHistoryValueChangeManageStep()) // run step 2
.next(companyComplianceLastExecutionUpdateStep()) // step 3: after all steps complete, update the last-success date in BATCH_LAST_EXECUTION
.build();
}
@@ -115,7 +115,12 @@ public class CompanyComplianceImportRangeJobConfig extends BaseMultiStepJobConfi
protected ItemProcessor<CompanyComplianceDto, CompanyComplianceEntity> createProcessor() {
return companyComplianceDataProcessor;
}
@Bean
@StepScope
public CompanyComplianceDataProcessor companyComplianceDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
return new CompanyComplianceDataProcessor(jobExecutionId);
}
@Override
protected ItemWriter<CompanyComplianceEntity> createWriter() {
return companyComplianceDataWriter;
@@ -155,7 +160,8 @@ public class CompanyComplianceImportRangeJobConfi
log.info("Company Compliance History Value Change Manage procedure parameters (converted to KST): start date: {}, end date: {}", startDt, endDt);
// 3. Call the stored procedure (safe parameter binding recommended)
jdbcTemplate.update("CALL new_snp.company_compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", startDt, endDt);
String procedureCall = String.format("CALL %s.company_compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", targetSchema);
jdbcTemplate.update(procedureCall, startDt, endDt);
log.info(">>>>> Company Compliance History Value Change Manage procedure call complete");
return RepeatStatus.FINISHED;

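The schema change above swaps the hard-coded `SNP_DATA` prefix for the injected `targetSchema` property. A minimal plain-Java sketch of that SQL construction (no Spring wiring; in the real config `targetSchema` comes from `@Value("${app.batch.target-schema.name}")`):

```java
// Sketch (plain Java, no Spring) of the schema-qualified getBatchUpdateSql().
class BatchUpdateSqlBuilder {
    private final String targetSchema;

    BatchUpdateSqlBuilder(String targetSchema) {
        this.targetSchema = targetSchema;
    }

    // Mirrors getBatchUpdateSql(): only the schema and the API key vary per job.
    String batchUpdateSql(String apiKey) {
        return String.format(
                "UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'",
                targetSchema, apiKey);
    }
}
```

Each job config then only supplies its own API key; the schema varies per deployment profile.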

@@ -14,6 +14,7 @@ import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
@@ -30,6 +31,9 @@ public class ComplianceImportJobConfig extends BaseJobConfig<ComplianceDto, Comp
private final ComplianceDataWriter complianceDataWriter;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Override
protected int getChunkSize() {
return 5000; // the API returns 5,000 records per call, so the chunk size matches
@@ -60,7 +64,7 @@ public class ComplianceImportJobConfig extends BaseJobConfig<ComplianceDto, Comp
@Override
protected ItemReader<ComplianceDto> createReader() {
return new ComplianceDataReader(maritimeServiceApiWebClient, jdbcTemplate);
return new ComplianceDataReader(maritimeServiceApiWebClient, jdbcTemplate, targetSchema);
}
@Override


@@ -28,7 +28,8 @@ import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.web.reactive.function.client.WebClient;
import java.time.*;
import java.time.OffsetDateTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Map;
@@ -45,9 +46,12 @@ public class ComplianceImportRangeJobConfig extends BaseMultiStepJobConfig<Compl
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "COMPLIANCE_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override
protected int getChunkSize() {
@@ -87,7 +91,7 @@ public class ComplianceImportRangeJobConfig extends BaseMultiStepJobConfig<Compl
protected Job createJobFlow(JobBuilder jobBuilder) {
return jobBuilder
.start(complianceImportRangeStep()) // step 1: import
.next(complianceHistoryValueChangeManageStep()) // step 2: history value-change management
// .next(complianceHistoryValueChangeManageStep()) // step 2: history value-change management (disabled)
.next(complianceLastExecutionUpdateStep()) // step 3: after all steps complete, update the last success date in BATCH_LAST_EXECUTION
.build();
}
@@ -112,6 +116,13 @@ public class ComplianceImportRangeJobConfig extends BaseMultiStepJobConfig<Compl
return complianceDataProcessor;
}
@Bean
@StepScope
public ComplianceDataProcessor complianceDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
return new ComplianceDataProcessor(jobExecutionId);
}
@Override
protected ItemWriter<ComplianceEntity> createWriter() {
return complianceDataWriter;
@@ -151,7 +162,8 @@ public class ComplianceImportRangeJobConfig extends BaseMultiStepJobConfig<Compl
log.info("Compliance History Value Change Manage procedure parameters (converted to KST): start date: {}, end date: {}", startDt, endDt);
// 3. Call the stored procedure (safe parameter binding recommended)
jdbcTemplate.update("CALL new_snp.compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", startDt, endDt);
String procedureCall = String.format("CALL %s.compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", targetSchema);
jdbcTemplate.update(procedureCall, startDt, endDt);
log.info(">>>>> Compliance History Value Change Manage procedure call complete");
return RepeatStatus.FINISHED;

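The same parameterization is applied to the stored-procedure call above: only the schema name is interpolated into the SQL text, while the two dates stay as bound JDBC placeholders. A sketch under those assumptions (helper name hypothetical):

```java
// Sketch: identifiers (schema, procedure) cannot be bound as ? parameters,
// so only they go through String.format; dates remain placeholders.
class ProcedureCallBuilder {
    static String historyManageCall(String targetSchema, String procedureName) {
        return String.format(
                "CALL %s.%s(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))",
                targetSchema, procedureName);
    }
}
```

Usage mirrors the diff: `jdbcTemplate.update(ProcedureCallBuilder.historyManageCall(targetSchema, "compliance_history_value_change_manage"), startDt, endDt);`.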

@@ -4,12 +4,16 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.compliance.batch.dto.CompanyComplianceDto;
import com.snp.batch.jobs.compliance.batch.entity.CompanyComplianceEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
@Slf4j
@Component
public class CompanyComplianceDataProcessor extends BaseProcessor<CompanyComplianceDto, CompanyComplianceEntity> {
private final Long jobExecutionId;
public CompanyComplianceDataProcessor(@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
this.jobExecutionId = jobExecutionId;
}
@Override
protected CompanyComplianceEntity processItem(CompanyComplianceDto dto) throws Exception {
@@ -30,6 +34,8 @@ public class CompanyComplianceDataProcessor extends BaseProcessor<CompanyComplia
.companyOnSwissSanctionList(dto.getCompanyOnSwissSanctionList())
.companyOnUAESanctionList(dto.getCompanyOnUAESanctionList())
.companyOnUNSanctionList(dto.getCompanyOnUNSanctionList())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;
}


@@ -4,11 +4,16 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.compliance.batch.dto.ComplianceDto;
import com.snp.batch.jobs.compliance.batch.entity.ComplianceEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
@Slf4j
@Component
public class ComplianceDataProcessor extends BaseProcessor<ComplianceDto, ComplianceEntity> {
private final Long jobExecutionId;
public ComplianceDataProcessor(@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
this.jobExecutionId = jobExecutionId;
}
@Override
protected ComplianceEntity processItem(ComplianceDto dto) throws Exception {
@@ -50,6 +55,8 @@ public class ComplianceDataProcessor extends BaseProcessor<ComplianceDto, Compli
.shipSTSPartnerNonComplianceLast12m(dto.getShipSTSPartnerNonComplianceLast12m())
.shipSwissSanctionList(dto.getShipSwissSanctionList())
.shipUNSanctionList(dto.getShipUNSanctionList())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;


@@ -19,14 +19,16 @@ public class ComplianceDataReader extends BaseApiReader<ComplianceDto> {
// 3. Update the response data into Core20 (repeated chunk by chunk)
private final JdbcTemplate jdbcTemplate;
private final String targetSchema;
private List<String> allImoNumbers;
private int currentBatchIndex = 0;
private final int batchSize = 100;
public ComplianceDataReader(WebClient webClient, JdbcTemplate jdbcTemplate) {
public ComplianceDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, String targetSchema) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
this.targetSchema = targetSchema;
enableChunkMode(); // enable chunk mode
}
@@ -47,16 +49,17 @@ public class ComplianceDataReader extends BaseApiReader<ComplianceDto> {
}
private String getTargetTable(){
return "snp_data.core20";
return targetSchema + ".ship_data";
}
private String getImoQuery() {
return "select imo_number as ihslrorimoshipno from " + getTargetTable() + " order by imo_number";
}
private String GET_CORE_IMO_LIST =
// "SELECT ihslrorimoshipno FROM " + getTargetTable() + " ORDER BY ihslrorimoshipno";
"select imo_number as ihslrorimoshipno from snp_data.ship_data order by imo_number";
@Override
protected void beforeFetch(){
log.info("[{}] Starting IMO number lookup from the Core20 table...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_CORE_IMO_LIST, String.class);
allImoNumbers = jdbcTemplate.queryForList(getImoQuery(), String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);

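The reader above loads the full IMO list once in `beforeFetch()`, then walks it in fixed batches (`batchSize = 100`), computing the batch count with `Math.ceil`. A dependency-free sketch of that arithmetic (class name hypothetical):

```java
import java.util.List;

// Sketch of ComplianceDataReader's batching: fetch the list once, then slice per batch.
class ImoBatcher {
    private final int batchSize;

    ImoBatcher(int batchSize) {
        this.batchSize = batchSize;
    }

    // Matches the reader's totalBatches computation.
    int totalBatches(int totalImos) {
        return (int) Math.ceil((double) totalImos / batchSize);
    }

    // Returns the IMO numbers for one batch; the last batch may be short.
    List<String> batch(List<String> allImoNumbers, int batchIndex) {
        int from = batchIndex * batchSize;
        int to = Math.min(from + batchSize, allImoNumbers.size());
        return allImoNumbers.subList(from, to);
    }
}
```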

@@ -6,5 +6,5 @@ import java.util.List;
public interface CompanyComplianceRepository {
void saveCompanyComplianceAll(List<CompanyComplianceEntity> items);
void saveCompanyComplianceHistoryAll(List<CompanyComplianceEntity> items);
// void saveCompanyComplianceHistoryAll(List<CompanyComplianceEntity> items);
}


@@ -3,6 +3,7 @@ package com.snp.batch.jobs.compliance.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.compliance.batch.entity.CompanyComplianceEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -14,13 +15,25 @@ import java.util.List;
@Slf4j
@Repository("CompanyComplianceRepository")
public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyComplianceEntity, Long> implements CompanyComplianceRepository{
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.risk-compliance-003}")
private String tableName;
public CompanyComplianceRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return null;
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -40,35 +53,18 @@ public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyC
@Override
protected String getUpdateSql() {
return null;
}
protected String getUpdateSql(String targetTable, String targetIndex) {
return """
INSERT INTO new_snp.%s(
owcode, lastupdated,
companyoverallcompliancestatus, companyonaustraliansanctionlist, companyonbessanctionlist, companyoncanadiansanctionlist, companyinofacsanctionedcountry,
companyinfatfjurisdiction, companyoneusanctionlist, companyonofacsanctionlist, companyonofacnonsdnsanctionlist, companyonofacssilist,
companyonswisssanctionlist, companyonuaesanctionlist, companyonunsanctionlist, parentcompanycompliancerisk
INSERT INTO %s(
company_cd, lst_mdfcn_dt,
company_snths_compliance_status, company_aus_sanction_list, company_bes_sanction_list, company_can_sanction_list, company_ofac_sanction_country,
company_fatf_cmptnc_country, company_eu_sanction_list, company_ofac_sanction_list, company_ofac_non_sdn_sanction_list, company_ofacssi_sanction_list,
company_swiss_sanction_list, company_uae_sanction_list, company_un_sanction_list, prnt_company_compliance_risk,
job_execution_id, creatr_id
)VALUES(
?, ?::timestamp, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?
)ON CONFLICT (%s)
DO UPDATE SET
companyoverallcompliancestatus = EXCLUDED.companyoverallcompliancestatus,
companyonaustraliansanctionlist = EXCLUDED.companyonaustraliansanctionlist,
companyonbessanctionlist = EXCLUDED.companyonbessanctionlist,
companyoncanadiansanctionlist = EXCLUDED.companyoncanadiansanctionlist,
companyinofacsanctionedcountry = EXCLUDED.companyinofacsanctionedcountry,
companyinfatfjurisdiction = EXCLUDED.companyinfatfjurisdiction,
companyoneusanctionlist = EXCLUDED.companyoneusanctionlist,
companyonofacsanctionlist = EXCLUDED.companyonofacsanctionlist,
companyonofacnonsdnsanctionlist = EXCLUDED.companyonofacnonsdnsanctionlist,
companyonofacssilist = EXCLUDED.companyonofacssilist,
companyonswisssanctionlist = EXCLUDED.companyonswisssanctionlist,
companyonuaesanctionlist = EXCLUDED.companyonuaesanctionlist,
companyonunsanctionlist = EXCLUDED.companyonunsanctionlist,
parentcompanycompliancerisk = EXCLUDED.parentcompanycompliancerisk
""".formatted(targetTable, targetIndex);
?, ?::timestamp, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?
);
""".formatted(getTableName());
}
@Override
@@ -94,6 +90,8 @@ public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyC
ps.setObject(idx++, entity.getCompanyOnUAESanctionList(), Types.INTEGER);
ps.setObject(idx++, entity.getCompanyOnUNSanctionList(), Types.INTEGER);
ps.setObject(idx++, entity.getParentCompanyNonCompliance(), Types.INTEGER);
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
@Override
@@ -106,7 +104,7 @@ public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyC
if (items == null || items.isEmpty()) {
return;
}
jdbcTemplate.batchUpdate(getUpdateSql("tb_company_compliance_info", "owcode"), items, items.size(),
jdbcTemplate.batchUpdate(getUpdateSql(), items, items.size(),
(ps, entity) -> {
try {
setUpdateParameters(ps, entity);
@@ -118,20 +116,4 @@ public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyC
log.info("{} save complete: updated={} rows", getEntityName(), items.size());
}
@Override
public void saveCompanyComplianceHistoryAll(List<CompanyComplianceEntity> items) {
if (items == null || items.isEmpty()) {
return;
}
jdbcTemplate.batchUpdate(getUpdateSql("tb_company_compliance_hstry", "owcode, lastupdated"), items, items.size(),
(ps, entity) -> {
try {
setUpdateParameters(ps, entity);
} catch (Exception e) {
log.error("Failed to set batch update parameters", e);
throw new RuntimeException(e);
}
});
log.info("{} save complete: updated={} rows", getEntityName(), items.size());
}
}

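The rewritten `getUpdateSql()` above drops the per-call table/index arguments and instead formats a Java text block with the schema-qualified table from `getTableName()`. An abbreviated sketch (column list shortened here; the real statement carries every mapped column plus `job_execution_id` and `creatr_id`):

```java
// Sketch only: the real column list is much longer.
class InsertSqlSketch {
    static String insertSql(String qualifiedTable) {
        return """
                INSERT INTO %s(
                    company_cd, lst_mdfcn_dt,
                    job_execution_id, creatr_id
                )VALUES(
                    ?, ?::timestamp, ?, ?
                );
                """.formatted(qualifiedTable);
    }
}
```

Note that the `ON CONFLICT ... DO UPDATE` clause from the old version is gone: the new statement is a plain INSERT, consistent with the history step being disabled.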

@@ -6,5 +6,5 @@ import java.util.List;
public interface ComplianceRepository {
void saveComplianceAll(List<ComplianceEntity> items);
void saveComplianceHistoryAll(List<ComplianceEntity> items);
// void saveComplianceHistoryAll(List<ComplianceEntity> items);
}


@@ -3,6 +3,7 @@ package com.snp.batch.jobs.compliance.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.compliance.batch.entity.ComplianceEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -15,13 +16,24 @@ import java.util.List;
@Repository("ComplianceRepository")
public class ComplianceRepositoryImpl extends BaseJdbcRepository<ComplianceEntity, Long> implements ComplianceRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.risk-compliance-002}")
private String tableName;
public ComplianceRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return null;
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,63 +53,26 @@ public class ComplianceRepositoryImpl extends BaseJdbcRepository<ComplianceEntit
@Override
protected String getUpdateSql() {
return null;
}
protected String getUpdateSql(String targetTable, String targetIndex) {
return """
INSERT INTO new_snp.%s (
lrimoshipno, dateamended, legaloverall, shipbessanctionlist, shipdarkactivityindicator,
shipdetailsnolongermaintained, shipeusanctionlist, shipflagdisputed, shipflagsanctionedcountry,
shiphistoricalflagsanctionedcountry, shipofacnonsdnsanctionlist, shipofacsanctionlist,
shipofacadvisorylist, shipownerofacssilist, shipowneraustraliansanctionlist, shipownerbessanctionlist,
shipownercanadiansanctionlist, shipownereusanctionlist, shipownerfatfjurisdiction,
shipownerhistoricalofacsanctionedcountry, shipownerofacsanctionlist, shipownerofacsanctionedcountry,
shipownerparentcompanynoncompliance, shipownerparentfatfjurisdiction, shipownerparentofacsanctionedcountry,
shipownerswisssanctionlist, shipowneruaesanctionlist, shipownerunsanctionlist,
shipsanctionedcountryportcalllast12m, shipsanctionedcountryportcalllast3m, shipsanctionedcountryportcalllast6m,
shipsecuritylegaldisputeevent, shipstspartnernoncompliancelast12m, shipswisssanctionlist,
shipunsanctionlist
INSERT INTO %s (
imo_no, last_mdfcn_dt, lgl_snths_sanction, ship_bes_sanction_list, ship_dark_actv_ind,
ship_dtld_info_ntmntd, ship_eu_sanction_list, ship_flg_dspt, ship_flg_sanction_country,
ship_flg_sanction_country_hstry, ship_ofac_non_sdn_sanction_list, ship_ofac_sanction_list,
ship_ofac_cutn_list, ship_ownr_ofcs_sanction_list, ship_ownr_aus_sanction_list, ship_ownr_bes_sanction_list,
ship_ownr_can_sanction_list, ship_ownr_eu_sanction_list, ship_ownr_fatf_rgl_zone,
ship_ownr_ofac_sanction_hstry, ship_ownr_ofac_sanction_list, ship_ownr_ofac_sanction_country,
ship_ownr_prnt_company_ncmplnc, ship_ownr_prnt_company_fatf_rgl_zone, ship_ownr_prnt_company_ofac_sanction_country,
ship_ownr_swi_sanction_list, ship_ownr_uae_sanction_list, ship_ownr_un_sanction_list,
ship_sanction_country_prtcll_last_twelve_m, ship_sanction_country_prtcll_last_thr_m, ship_sanction_country_prtcll_last_six_m,
ship_scrty_lgl_dspt_event, ship_sts_prtnr_non_compliance_twelve_m, ship_swi_sanction_list,
ship_un_sanction_list,
job_execution_id, creatr_id
)
VALUES (
?, ?::timestamptz, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?
)
ON CONFLICT (%s)
DO UPDATE SET
legaloverall = EXCLUDED.legaloverall,
shipbessanctionlist = EXCLUDED.shipbessanctionlist,
shipdarkactivityindicator = EXCLUDED.shipdarkactivityindicator,
shipdetailsnolongermaintained = EXCLUDED.shipdetailsnolongermaintained,
shipeusanctionlist = EXCLUDED.shipeusanctionlist,
shipflagdisputed = EXCLUDED.shipflagdisputed,
shipflagsanctionedcountry = EXCLUDED.shipflagsanctionedcountry,
shiphistoricalflagsanctionedcountry = EXCLUDED.shiphistoricalflagsanctionedcountry,
shipofacnonsdnsanctionlist = EXCLUDED.shipofacnonsdnsanctionlist,
shipofacsanctionlist = EXCLUDED.shipofacsanctionlist,
shipofacadvisorylist = EXCLUDED.shipofacadvisorylist,
shipownerofacssilist = EXCLUDED.shipownerofacssilist,
shipowneraustraliansanctionlist = EXCLUDED.shipowneraustraliansanctionlist,
shipownerbessanctionlist = EXCLUDED.shipownerbessanctionlist,
shipownercanadiansanctionlist = EXCLUDED.shipownercanadiansanctionlist,
shipownereusanctionlist = EXCLUDED.shipownereusanctionlist,
shipownerfatfjurisdiction = EXCLUDED.shipownerfatfjurisdiction,
shipownerhistoricalofacsanctionedcountry = EXCLUDED.shipownerhistoricalofacsanctionedcountry,
shipownerofacsanctionlist = EXCLUDED.shipownerofacsanctionlist,
shipownerofacsanctionedcountry = EXCLUDED.shipownerofacsanctionedcountry,
shipownerparentcompanynoncompliance = EXCLUDED.shipownerparentcompanynoncompliance,
shipownerparentfatfjurisdiction = EXCLUDED.shipownerparentfatfjurisdiction,
shipownerparentofacsanctionedcountry = EXCLUDED.shipownerparentofacsanctionedcountry,
shipownerswisssanctionlist = EXCLUDED.shipownerswisssanctionlist,
shipowneruaesanctionlist = EXCLUDED.shipowneruaesanctionlist,
shipownerunsanctionlist = EXCLUDED.shipownerunsanctionlist,
shipsanctionedcountryportcalllast12m = EXCLUDED.shipsanctionedcountryportcalllast12m,
shipsanctionedcountryportcalllast3m = EXCLUDED.shipsanctionedcountryportcalllast3m,
shipsanctionedcountryportcalllast6m = EXCLUDED.shipsanctionedcountryportcalllast6m,
shipsecuritylegaldisputeevent = EXCLUDED.shipsecuritylegaldisputeevent,
shipstspartnernoncompliancelast12m = EXCLUDED.shipstspartnernoncompliancelast12m,
shipswisssanctionlist = EXCLUDED.shipswisssanctionlist,
shipunsanctionlist = EXCLUDED.shipunsanctionlist
""".formatted(targetTable, targetIndex);
?, ?::timestamptz, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?
);
""".formatted(getTableName());
}
@Override
@@ -143,6 +118,8 @@ public class ComplianceRepositoryImpl extends BaseJdbcRepository<ComplianceEntit
ps.setObject(idx++, entity.getShipSTSPartnerNonComplianceLast12m(), Types.INTEGER);
ps.setObject(idx++, entity.getShipSwissSanctionList(), Types.INTEGER);
ps.setObject(idx++, entity.getShipUNSanctionList(), Types.INTEGER);
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
@Override
@@ -155,24 +132,7 @@ public class ComplianceRepositoryImpl extends BaseJdbcRepository<ComplianceEntit
if (items == null || items.isEmpty()) {
return;
}
jdbcTemplate.batchUpdate(getUpdateSql("compliance", "lrimoshipno"), items, items.size(),
(ps, entity) -> {
try {
setUpdateParameters(ps, entity);
} catch (Exception e) {
log.error("Failed to set batch update parameters", e);
throw new RuntimeException(e);
}
});
log.info("{} save complete: updated={} rows", getEntityName(), items.size());
}
@Override
public void saveComplianceHistoryAll(List<ComplianceEntity> items) {
if (items == null || items.isEmpty()) {
return;
}
jdbcTemplate.batchUpdate(getUpdateSql("compliance_history", "lrimoshipno, dateamended"), items, items.size(),
jdbcTemplate.batchUpdate(getUpdateSql(), items, items.size(),
(ps, entity) -> {
try {
setUpdateParameters(ps, entity);


@@ -20,6 +20,6 @@ public class CompanyComplianceDataWriter extends BaseWriter<CompanyComplianceEnt
@Override
protected void writeItems(List<CompanyComplianceEntity> items) throws Exception {
complianceRepository.saveCompanyComplianceAll(items);
complianceRepository.saveCompanyComplianceHistoryAll(items);
// complianceRepository.saveCompanyComplianceHistoryAll(items);
}
}


@@ -19,6 +19,6 @@ public class ComplianceDataWriter extends BaseWriter<ComplianceEntity> {
@Override
protected void writeItems(List<ComplianceEntity> items) throws Exception {
complianceRepository.saveComplianceAll(items);
complianceRepository.saveComplianceHistoryAll(items);
// complianceRepository.saveComplianceHistoryAll(items);
}
}


@@ -42,9 +42,12 @@ public class EventImportJobConfig extends BaseMultiStepJobConfig<EventDetailDto,
@Value("${app.batch.ship-api.url}")
private String maritimeApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "EVENT_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override
protected int getChunkSize() {
@@ -109,6 +112,14 @@ public class EventImportJobConfig extends BaseMultiStepJobConfig<EventDetailDto,
return eventDataProcessor;
}
@Bean
@StepScope
public EventDataProcessor eventDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId
){
return new EventDataProcessor(jobExecutionId);
}
@Override
protected ItemWriter<EventDetailEntity> createWriter() { return eventDataWriter; }


@@ -1,12 +1,17 @@
package com.snp.batch.jobs.event.batch.processor;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.event.batch.dto.CargoDto;
import com.snp.batch.jobs.event.batch.dto.EventDetailDto;
import com.snp.batch.jobs.event.batch.dto.HumanCasualtyDto;
import com.snp.batch.jobs.event.batch.dto.RelationshipDto;
import com.snp.batch.jobs.event.batch.entity.CargoEntity;
import com.snp.batch.jobs.event.batch.entity.EventDetailEntity;
import com.snp.batch.jobs.event.batch.entity.HumanCasualtyEntity;
import com.snp.batch.jobs.event.batch.entity.RelationshipEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.util.stream.Collectors;
@@ -14,6 +19,12 @@ import java.util.stream.Collectors;
@Slf4j
@Component
public class EventDataProcessor extends BaseProcessor<EventDetailDto, EventDetailEntity> {
private final Long jobExecutionId; // per-execution value; a static field here would leak state across step scopes
public EventDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId
) {
this.jobExecutionId = jobExecutionId;
}
@Override
protected EventDetailEntity processItem(EventDetailDto dto) throws Exception {
log.debug("Event data processing started: Event ID = {}", dto.getEventID());
@@ -61,12 +72,20 @@ public class EventDataProcessor extends BaseProcessor<EventDetailDto, EventDetai
.firedUpon(dto.getFiredUpon())
.eventStartDate(dto.getEventStartDate())
.eventEndDate(dto.getEventEndDate())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.cargoes(dto.getCargoes() != null ?
dto.getCargoes().stream().map(CargoDto::toEntity).collect(Collectors.toList()) : null)
dto.getCargoes().stream()
.map(d -> (CargoEntity) d.toEntity().setBatchInfo(jobExecutionId, "SYSTEM"))
.collect(Collectors.toList()) : null)
.humanCasualties(dto.getHumanCasualties() != null ?
dto.getHumanCasualties().stream().map(HumanCasualtyDto::toEntity).collect(Collectors.toList()) : null)
dto.getHumanCasualties().stream()
.map(d -> (HumanCasualtyEntity) d.toEntity().setBatchInfo(jobExecutionId, "SYSTEM"))
.collect(Collectors.toList()) : null)
.relationships(dto.getRelationships() != null ?
dto.getRelationships().stream().map(RelationshipDto::toEntity).collect(Collectors.toList()) : null)
dto.getRelationships().stream()
.map(d -> (RelationshipEntity) d.toEntity().setBatchInfo(jobExecutionId, "SYSTEM"))
.collect(Collectors.toList()) : null)
.build();
log.debug("Event data processing completed: Event ID = {}", dto.getEventID());

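`setBatchInfo` is chained inside each stream `map` above, so it must return the entity itself. Hypothetical minimal stand-ins for the diff's DTO/entity types make the pattern concrete (names and fields simplified):

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical minimal stand-in for the diff's CargoEntity.
class CargoEntity {
    Long jobExecutionId;
    String createdBy;

    CargoEntity setBatchInfo(Long jobExecutionId, String createdBy) {
        this.jobExecutionId = jobExecutionId;
        this.createdBy = createdBy;
        return this; // fluent return enables .map(d -> d.toEntity().setBatchInfo(...))
    }
}

// Hypothetical minimal stand-in for the diff's CargoDto.
class CargoDto {
    CargoEntity toEntity() { return new CargoEntity(); }
}

class ChildMappingSketch {
    // Mirrors the processor's null-safe child mapping.
    static List<CargoEntity> mapCargoes(List<CargoDto> cargoes, Long jobExecutionId) {
        return cargoes == null ? null
                : cargoes.stream()
                         .map(d -> d.toEntity().setBatchInfo(jobExecutionId, "SYSTEM"))
                         .collect(Collectors.toList());
    }
}
```

The cast in the diff (`(CargoEntity) d.toEntity().setBatchInfo(...)`) suggests `setBatchInfo` is declared on a shared base entity returning the base type; this sketch sidesteps that by returning the concrete type.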

@@ -5,9 +5,8 @@ import com.snp.batch.jobs.event.batch.entity.CargoEntity;
import com.snp.batch.jobs.event.batch.entity.EventDetailEntity;
import com.snp.batch.jobs.event.batch.entity.HumanCasualtyEntity;
import com.snp.batch.jobs.event.batch.entity.RelationshipEntity;
import com.snp.batch.jobs.shipdetail.batch.entity.GroupBeneficialOwnerHistoryEntity;
import com.snp.batch.jobs.shipdetail.batch.repository.ShipDetailSql;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -20,13 +19,24 @@
@Repository("EventRepository")
public class EventRepositoryImpl extends BaseJdbcRepository<EventDetailEntity, Long> implements EventRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.event-001}")
private String tableName;
public EventRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return null;
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -182,6 +192,8 @@ public class EventRepositoryImpl extends BaseJdbcRepository<EventDetailEntity, L
ps.setString(idx++, entity.getVesselName()); // vessel_name (previously missing)
ps.setString(idx++, entity.getVesselType()); // vessel_type (previously missing)
ps.setString(idx++, entity.getVesselTypeDecode()); // vessel_type_decode
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
private void setCargoInsertParameters(PreparedStatement ps, CargoEntity entity)throws Exception{
int idx = 1;
@@ -196,6 +208,8 @@ public class EventRepositoryImpl extends BaseJdbcRepository<EventDetailEntity, L
ps.setString(idx++, entity.getCargoDamage());
ps.setString(idx++, entity.getDangerous());
ps.setString(idx++, entity.getText());
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
private void setHumanCasualtyInsertParameters(PreparedStatement ps, HumanCasualtyEntity entity)throws Exception{
int idx = 1;
@@ -204,6 +218,8 @@ public class EventRepositoryImpl extends BaseJdbcRepository<EventDetailEntity, L
ps.setString(idx++, entity.getType());
ps.setString(idx++, entity.getQualifier());
ps.setObject(idx++, entity.getCount());
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
private void setRelationshipInsertParameters(PreparedStatement ps, RelationshipEntity entity)throws Exception{
int idx = 1;
@@ -214,6 +230,8 @@ public class EventRepositoryImpl extends BaseJdbcRepository<EventDetailEntity, L
ps.setObject(idx++, entity.getEventID2());
ps.setString(idx++, entity.getEventType());
ps.setString(idx++, entity.getEventTypeCode());
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
private static void setStringOrNull(PreparedStatement ps, int index, String value) throws Exception {

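The EventSql class that follows injects instance-level `@Value` properties into static fields so that its static SQL-builder methods can read Spring-managed configuration. A dependency-free sketch of that pattern (class and method names hypothetical):

```java
// Sketch of the "instance setter writes a static field" pattern used by EventSql.
class SchemaHolder {
    private static String targetSchema;

    // In EventSql this setter carries @Value("${app.batch.target-schema.name}").
    void setTargetSchema(String schema) {
        SchemaHolder.targetSchema = schema;
    }

    // Static builders can then qualify table names without any injection.
    static String qualify(String table) {
        return targetSchema + "." + table;
    }
}
```

The trade-off: static state makes the builders callable without wiring, but the value is process-global and only valid after Spring has initialized the bean.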

@@ -1,122 +1,116 @@
package com.snp.batch.jobs.event.batch.repository;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
/**
 * SQL builder for Event-related statements.
 * Uses the app.batch.target-schema.name value from application.yml.
 */
@Component
public class EventSql {
private static String targetSchema;
private static String eventTable;
private static String eventCargoTable;
private static String eventRelationshipTable;
private static String eventHumanCasualtyTable;
@Value("${app.batch.target-schema.name}")
public void setTargetSchema(String schema) {
EventSql.targetSchema = schema;
}
@Value("${app.batch.target-schema.tables.event-001}")
public void setEventTable(String table) {
EventSql.eventTable = table;
}
@Value("${app.batch.target-schema.tables.event-002}")
public void setEventCargoTable(String table) {
EventSql.eventCargoTable = table;
}
@Value("${app.batch.target-schema.tables.event-004}")
public void setEventRelationshipTable(String table) {
EventSql.eventRelationshipTable = table;
}
@Value("${app.batch.target-schema.tables.event-003}")
public void setEventHumanCasualtyTable(String table) {
EventSql.eventHumanCasualtyTable = table;
}
public static String getTargetSchema() {
return targetSchema;
}
public static String getEventDetailUpdateSql(){
return """
INSERT INTO new_snp.event (
event_id, incident_id, ihslrorimoshipno, published_date, event_start_date, event_end_date,
attempted_boarding, cargo_loading_status_code, casualty_action,
casualty_zone, casualty_zone_code, component2, country_code,
date_of_build, description, environment_location, location_name,
marsden_grid_reference, town_name, event_type, event_type_detail,
event_type_detail_id, event_type_id, fired_upon, headline,
ldt_at_time, significance, weather, pollutant, pollutant_quantity,
pollutant_unit, registered_owner_code_at_time, registered_owner_at_time,
registered_owner_country_code_at_time, registered_owner_country_at_time,
vessel_dwt, vessel_flag_code, vessel_flag_decode, vessel_gt,
vessel_name, vessel_type, vessel_type_decode
INSERT INTO %s.%s (
event_id, acdnt_id, imo_no, pstg_ymd, event_start_day, event_end_day,
embrk_try_yn, cargo_capacity_status_cd, acdnt_actn,
acdnt_zone, acdnt_zone_cd, cfg_cmpnt_two, country_cd,
build_ymd, event_expln, env_position, position_nm,
masd_grid_ref, cty_nm, event_type, event_type_dtl,
event_type_dtl_id, event_type_id, firedupon_yn, sj,
ldt_timpt, signfct, wethr, pltn_matral, pltn_matral_cnt,
pltn_matral_unit, reg_shponr_cd_hr, reg_shponr_hr,
reg_shponr_country_cd_hr, reg_shponr_country_hr,
ship_dwt, ship_flg_cd, ship_flg_decd, ship_gt,
ship_nm, ship_type, ship_type_nm,
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?::timestamptz,?::timestamptz,?::timestamptz, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?
)
ON CONFLICT (event_id)
DO UPDATE SET
incident_id = EXCLUDED.incident_id,
ihslrorimoshipno = EXCLUDED.ihslrorimoshipno,
published_date = EXCLUDED.published_date,
event_start_date = EXCLUDED.event_start_date,
event_end_date = EXCLUDED.event_end_date,
attempted_boarding = EXCLUDED.attempted_boarding,
cargo_loading_status_code = EXCLUDED.cargo_loading_status_code,
casualty_action = EXCLUDED.casualty_action,
casualty_zone = EXCLUDED.casualty_zone,
casualty_zone_code = EXCLUDED.casualty_zone_code,
component2 = EXCLUDED.component2,
country_code = EXCLUDED.country_code,
date_of_build = EXCLUDED.date_of_build,
description = EXCLUDED.description,
environment_location = EXCLUDED.environment_location,
location_name = EXCLUDED.location_name,
marsden_grid_reference = EXCLUDED.marsden_grid_reference,
town_name = EXCLUDED.town_name,
event_type = EXCLUDED.event_type,
event_type_detail = EXCLUDED.event_type_detail,
event_type_detail_id = EXCLUDED.event_type_detail_id,
event_type_id = EXCLUDED.event_type_id,
fired_upon = EXCLUDED.fired_upon,
headline = EXCLUDED.headline,
ldt_at_time = EXCLUDED.ldt_at_time,
significance = EXCLUDED.significance,
weather = EXCLUDED.weather,
pollutant = EXCLUDED.pollutant,
pollutant_quantity = EXCLUDED.pollutant_quantity,
pollutant_unit = EXCLUDED.pollutant_unit,
registered_owner_code_at_time = EXCLUDED.registered_owner_code_at_time,
registered_owner_at_time = EXCLUDED.registered_owner_at_time,
registered_owner_country_code_at_time = EXCLUDED.registered_owner_country_code_at_time,
registered_owner_country_at_time = EXCLUDED.registered_owner_country_at_time,
vessel_dwt = EXCLUDED.vessel_dwt,
vessel_flag_code = EXCLUDED.vessel_flag_code,
vessel_flag_decode = EXCLUDED.vessel_flag_decode,
vessel_gt = EXCLUDED.vessel_gt,
vessel_name = EXCLUDED.vessel_name,
vessel_type = EXCLUDED.vessel_type,
vessel_type_decode = EXCLUDED.vessel_type_decode
""";
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?
);
""".formatted(targetSchema, eventTable);
}
public static String getEventCargoSql(){
return """
INSERT INTO new_snp.event_cargo (
event_id, "sequence", ihslrorimoshipno, "type", quantity,
unit_short, unit, cargo_damage, dangerous, "text"
INSERT INTO %s.%s (
event_id, event_seq, imo_no, "type", cnt,
unit_abbr, unit, cargo_damg, risk_yn, "text",
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?, ?,
?, ?, ?, ?, ?
)
ON CONFLICT (event_id, ihslrorimoshipno, "type", "sequence")
DO UPDATE SET
quantity = EXCLUDED.quantity,
unit_short = EXCLUDED.unit_short,
unit = EXCLUDED.unit,
cargo_damage = EXCLUDED.cargo_damage,
dangerous = EXCLUDED.dangerous,
"text" = EXCLUDED."text"
""";
?, ?, ?, ?, ?,
?, ?
);
""".formatted(targetSchema, eventCargoTable);
}
public static String getEventRelationshipSql(){
return """
INSERT INTO new_snp.event_relationship (
incident_id, event_id, relationship_type, relationship_type_code,
event_id_2, event_type, event_type_code
INSERT INTO %s.%s (
acdnt_id, event_id, rel_type, rel_type_cd,
event_id_two, event_type, event_type_cd,
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?,
?, ?, ?
)
ON CONFLICT (incident_id, event_id, event_id_2, event_type_code, relationship_type_code)
DO UPDATE SET
relationship_type = EXCLUDED.relationship_type,
event_type = EXCLUDED.event_type
""";
?, ?, ?,
?, ?
);
""".formatted(targetSchema, eventRelationshipTable);
}
public static String getEventHumanCasualtySql(){
return """
INSERT INTO new_snp.event_humancasualty (
event_id, "scope", "type", qualifier, "count"
INSERT INTO %s.%s (
event_id, "scope", "type", qualfr, cnt,
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?, ?
)
ON CONFLICT (event_id, "scope", "type", qualifier)
DO UPDATE SET
"count" = EXCLUDED."count"
""";
?, ?, ?, ?, ?,
?, ?
);
""".formatted(targetSchema, eventHumanCasualtyTable);
}
}
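The upsert statements above pair a generated column list with positional `?` placeholders, and the diff shows how easily the two drift when columns such as job_execution_id and creatr_id are appended. A small, hypothetical sanity check (not part of this repository) can assert that the counts match:

```java
// Hypothetical helper: compare the number of JDBC '?' placeholders against
// the number of target columns for a generated INSERT statement.
public class SqlPlaceholderCheck {

    // Count positional '?' placeholders in a SQL string
    // (each "?::timestamptz" cast still contains exactly one '?').
    public static long countPlaceholders(String sql) {
        return sql.chars().filter(c -> c == '?').count();
    }

    // Count comma-separated names in a column list such as "a, b, c".
    public static int countColumns(String columnList) {
        return columnList.split(",").length;
    }

    public static void main(String[] args) {
        String columns = "event_id, acdnt_id, imo_no, job_execution_id, creatr_id";
        String sql = ("INSERT INTO %s.%s (" + columns + ") VALUES (?, ?, ?, ?, ?)")
                .formatted("new_snp", "t_event");
        System.out.println(countPlaceholders(sql) == countColumns(columns)); // true
    }
}
```

A unit test along these lines catches the off-by-two that appears when audit columns are added to the column list but not to the VALUES clause.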

View file

@@ -6,14 +6,17 @@ import com.snp.batch.jobs.facility.batch.entity.PortEntity;
import com.snp.batch.jobs.facility.batch.processor.PortDataProcessor;
import com.snp.batch.jobs.facility.batch.reader.PortDataReader;
import com.snp.batch.jobs.facility.batch.writer.PortDataWriter;
import com.snp.batch.service.BatchApiLogService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
@@ -26,9 +29,12 @@ public class PortImportJobConfig extends BaseJobConfig<PortDto, PortEntity> {
private final JdbcTemplate jdbcTemplate;
private final WebClient maritimeServiceApiWebClient;
private final PortDataProcessor portDataProcessor;
private final PortDataWriter portDataWriter;
private final PortDataReader portDataReader;
private final BatchApiLogService batchApiLogService;
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Override
protected int getChunkSize() {
@@ -37,46 +43,65 @@ public class PortImportJobConfig extends BaseJobConfig<PortDto, PortEntity> {
public PortImportJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
PortDataProcessor portDataProcessor,
PortDataReader portDataReader,
PortDataWriter portDataWriter,
JdbcTemplate jdbcTemplate,
@Qualifier("maritimeServiceApiWebClient")WebClient maritimeServiceApiWebClient) {
@Qualifier("maritimeServiceApiWebClient")WebClient maritimeServiceApiWebClient,
BatchApiLogService batchApiLogService) {
super(jobRepository, transactionManager);
this.jdbcTemplate = jdbcTemplate;
this.maritimeServiceApiWebClient = maritimeServiceApiWebClient;
this.portDataProcessor = portDataProcessor;
this.portDataWriter = portDataWriter;
this.portDataReader = portDataReader;
this.batchApiLogService = batchApiLogService;
}
@Override
protected String getJobName() {
return "portImportJob";
return "PortImportJob";
}
@Override
protected String getStepName() {
return "portImportStep";
return "PortImportStep";
}
@Override
protected ItemReader<PortDto> createReader() {
return new PortDataReader(maritimeServiceApiWebClient, jdbcTemplate);
return portDataReader;
}
@Bean
@StepScope
public PortDataReader portDataReader(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId, // SpEL로 ID 추출
@Value("#{stepExecution.id}") Long stepExecutionId
) {
PortDataReader reader = new PortDataReader(maritimeServiceApiWebClient, jdbcTemplate, batchApiLogService, maritimeServiceApiUrl);
reader.setExecutionIds(jobExecutionId, stepExecutionId); // set the IDs
return reader;
}
@Override
protected ItemProcessor<PortDto, PortEntity> createProcessor() {
return portDataProcessor;
// 2. switched to invoking the bean method
return portDataProcessor(null);
}
@Bean
@StepScope
public PortDataProcessor portDataProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
return new PortDataProcessor(jobExecutionId);
}
@Override
protected ItemWriter<PortEntity> createWriter() { return portDataWriter; }
@Bean(name = "portImportJob")
@Bean(name = "PortImportJob")
public Job portImportJob() {
return job();
}
@Bean(name = "portImportStep")
@Bean(name = "PortImportStep")
public Step portImportStep() {
return step();
}
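The @StepScope beans above use SpEL (#{stepExecution.jobExecution.id}) to hand the running execution's id to the processor, which then stamps it onto every row along with a creator id. Stripped of Spring, the pattern reduces to a processor constructed once per execution; the class and field names below are illustrative, not the project's:

```java
// Plain-Java sketch of the stamping pattern: the processor is built with
// the current execution id and copies it onto each entity it produces.
public class StampingProcessor {

    public static class Row {
        public Long jobExecutionId;
        public String createdBy;
        public String payload;
    }

    private final long jobExecutionId;

    public StampingProcessor(long jobExecutionId) {
        this.jobExecutionId = jobExecutionId;
    }

    public Row process(String payload) {
        Row row = new Row();
        row.payload = payload;
        row.jobExecutionId = jobExecutionId; // provenance column (job_execution_id)
        row.createdBy = "SYSTEM";            // audit column (creatr_id)
        return row;
    }
}
```

@StepScope matters here because the execution id only exists once a step is running; a singleton bean could not capture it in a final field.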

View file

@@ -4,11 +4,17 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.facility.batch.dto.PortDto;
import com.snp.batch.jobs.facility.batch.entity.PortEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
@Slf4j
@Component
public class PortDataProcessor extends BaseProcessor<PortDto, PortEntity> {
private final Long jobExecutionId;
public PortDataProcessor(@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId) {
this.jobExecutionId = jobExecutionId;
}
@Override
protected PortEntity processItem(PortDto dto) throws Exception {
log.debug("Port 데이터 처리 시작: Port ID = {}", dto.getPortId());
@@ -71,6 +77,8 @@ public class PortDataProcessor extends BaseProcessor<PortDto, PortEntity> {
.freeTradeZone(dto.getFreeTradeZone())
.ecoPort(dto.getEcoPort())
.emissionControlArea(dto.getEmissionControlArea())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
log.debug("Port 데이터 처리 완료: Port ID = {}", dto.getPortId());

View file

@@ -3,6 +3,7 @@ package com.snp.batch.jobs.facility.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.facility.batch.dto.PortDto;
import com.snp.batch.jobs.shipimport.batch.dto.ShipApiResponse;
import com.snp.batch.service.BatchApiLogService;
import lombok.extern.slf4j.Slf4j;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.jdbc.core.JdbcTemplate;
@@ -17,11 +18,14 @@ public class PortDataReader extends BaseApiReader<PortDto> {
private final JdbcTemplate jdbcTemplate;
private List<String> allImoNumbers;
private int currentBatchIndex = 0;
private final int batchSize = 100;
public PortDataReader(WebClient webClient, JdbcTemplate jdbcTemplate) {
private final int batchSize = 5000;
private final BatchApiLogService batchApiLogService;
String maritimeServiceApiUrl;
public PortDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, BatchApiLogService batchApiLogService, String maritimeServiceApiUrl) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
this.batchApiLogService = batchApiLogService;
this.maritimeServiceApiUrl = maritimeServiceApiUrl;
}
@Override
@@ -57,13 +61,12 @@ public class PortDataReader extends BaseApiReader<PortDto> {
}
private List<PortDto> callFacilityPortApiWithBatch() {
String url = getApiPath();
log.debug("[{}] API 호출: {}", getReaderName(), url);
return webClient.get()
.uri(url)
.retrieve()
.bodyToMono(new ParameterizedTypeReference<List<PortDto>>() {})
.block();
return executeListApiCall(
maritimeServiceApiUrl,
getApiPath(),
new ParameterizedTypeReference<List<PortDto>>() {},
batchApiLogService
);
}
}
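The reader above first loads every IMO number, then calls the API one fixed-size batch at a time (the diff raises batchSize from 100 to 5000). The slicing itself, separated from the WebClient and logging concerns, looks roughly like this:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the reader's batch slicing: split the full IMO-number list into
// fixed-size chunks; each chunk would back one API call.
public class ImoBatcher {

    public static List<List<String>> partition(List<String> all, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < all.size(); i += batchSize) {
            batches.add(all.subList(i, Math.min(i + batchSize, all.size())));
        }
        return batches;
    }
}
```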

View file

@@ -3,6 +3,7 @@ package com.snp.batch.jobs.facility.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.facility.batch.entity.PortEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -15,13 +16,24 @@ import java.util.List;
@Repository("FacilityRepository")
public class FacilityRepositoryImpl extends BaseJdbcRepository<PortEntity, Long> implements FacilityRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.facility-001}")
private String tableName;
public FacilityRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return null;
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -42,73 +54,24 @@ public class FacilityRepositoryImpl extends BaseJdbcRepository<PortEntity, Long>
@Override
protected String getUpdateSql() {
return """
INSERT INTO snp_data.facility_port (
port_ID, old_ID, status, port_Name, unlocode, countryCode, country_Name, region_Name, continent_Name, master_POID,
dec_Lat, dec_Long, position_lat, position_long, position_z, position_m, position_hasZ, position_hasM, position_isNull, position_stSrid, time_Zone, dayLight_Saving_Time,
maximum_Draft, max_LOA, max_Beam, max_DWT, max_Offshore_Draught, max_Offshore_LOA, max_Offshore_BCM, max_Offshore_DWT,
breakbulk_Facilities, container_Facilities, dry_Bulk_Facilities, liquid_Facilities, roRo_Facilities, passenger_Facilities, dry_Dock_Facilities,
lpG_Facilities, lnG_Facilities, lnG_Bunker, dO_Bunker, fO_Bunker, ispS_Compliant, csI_Compliant, free_Trade_Zone, ecO_Port, emission_Control_Area, wS_Port,
last_Update, entry_Date, batch_flag
INSERT INTO %s(
port_id, bfr_id, status, port_nm, un_port_cd, country_cd, country_nm, areanm, cntntnm, mst_port_id,
lat_decml, lon_decml, position_lat, position_lon, position_z_val, position_mval_val, z_val_has_yn, mval_val_has_yn, position_nul_yn, position_sts_id, hr_zone, daylgt_save_hr,
max_draft, max_whlnth, max_beam, max_dwt, max_sea_draft, max_sea_whlnth, max_sea_bcm, max_sea_dwt,
bale_cargo_facility, cntnr_facility, case_cargo_facility, liquid_cargo_facility, roro_facility, paxfclty, drydkfclty,
lpg_facility, lng_facility, lng_bnkr, do_bnkr, fo_bnkr, isps_compliance_yn, csi_compliance_yn, free_trd_zone, ecfrd_port, emsn_ctrl_area, ws_port,
last_mdfcn_dt, reg_ymd,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?::timestamptz, ?::timestamptz, 'N'
) ON CONFLICT (port_ID) DO UPDATE
SET
old_ID = EXCLUDED.old_ID,
status = EXCLUDED.status,
port_Name = EXCLUDED.port_Name,
unlocode = EXCLUDED.unlocode,
countryCode = EXCLUDED.countryCode,
country_Name = EXCLUDED.country_Name,
region_Name = EXCLUDED.region_Name,
continent_Name = EXCLUDED.continent_Name,
master_POID = EXCLUDED.master_POID,
dec_Lat = EXCLUDED.dec_Lat,
dec_Long = EXCLUDED.dec_Long,
position_lat = EXCLUDED.position_lat,
position_long = EXCLUDED.position_long,
position_z = EXCLUDED.position_z,
position_m = EXCLUDED.position_m,
position_hasZ = EXCLUDED.position_hasZ,
position_hasM = EXCLUDED.position_hasM,
position_isNull = EXCLUDED.position_isNull,
position_stSrid = EXCLUDED.position_stSrid,
time_Zone = EXCLUDED.time_Zone,
dayLight_Saving_Time = EXCLUDED.dayLight_Saving_Time,
maximum_Draft = EXCLUDED.maximum_Draft,
max_LOA = EXCLUDED.max_LOA,
max_Beam = EXCLUDED.max_Beam,
max_DWT = EXCLUDED.max_DWT,
max_Offshore_Draught = EXCLUDED.max_Offshore_Draught,
max_Offshore_LOA = EXCLUDED.max_Offshore_LOA,
max_Offshore_BCM = EXCLUDED.max_Offshore_BCM,
max_Offshore_DWT = EXCLUDED.max_Offshore_DWT,
breakbulk_Facilities = EXCLUDED.breakbulk_Facilities,
container_Facilities = EXCLUDED.container_Facilities,
dry_Bulk_Facilities = EXCLUDED.dry_Bulk_Facilities,
liquid_Facilities = EXCLUDED.liquid_Facilities,
roRo_Facilities = EXCLUDED.roRo_Facilities,
passenger_Facilities = EXCLUDED.passenger_Facilities,
dry_Dock_Facilities = EXCLUDED.dry_Dock_Facilities,
lpG_Facilities = EXCLUDED.lpG_Facilities,
lnG_Facilities = EXCLUDED.lnG_Facilities,
lnG_Bunker = EXCLUDED.lnG_Bunker,
dO_Bunker = EXCLUDED.dO_Bunker,
fO_Bunker = EXCLUDED.fO_Bunker,
ispS_Compliant = EXCLUDED.ispS_Compliant,
csI_Compliant = EXCLUDED.csI_Compliant,
free_Trade_Zone = EXCLUDED.free_Trade_Zone,
ecO_Port = EXCLUDED.ecO_Port,
emission_Control_Area = EXCLUDED.emission_Control_Area,
wS_Port = EXCLUDED.wS_Port,
last_Update = EXCLUDED.last_Update,
entry_Date = EXCLUDED.entry_Date,
batch_flag = 'N'
""";
?::timestamptz, ?::timestamptz,
?, ?
);
""".formatted(getTableName());
}
@Override
@@ -176,6 +139,8 @@ public class FacilityRepositoryImpl extends BaseJdbcRepository<PortEntity, Long>
setIntegerOrNull(ps, idx++, entity.getWsPort()); // original position: at the end (assuming setLongOrNull is used to match INT8)
ps.setString(idx++, entity.getLastUpdate()); // a Timestamp, rather than a String, would better fit the JDBC standard.
ps.setString(idx++, entity.getEntryDate()); // a Timestamp, rather than a String, would better fit the JDBC standard.
ps.setObject(idx++, entity.getJobExecutionId(), Types.INTEGER);
ps.setString(idx++, entity.getCreatedBy());
}
@Override

View file

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.AnchorageCallsDto;
import com.snp.batch.jobs.movement.batch.entity.AnchorageCallsEntity;
@@ -39,12 +40,17 @@ public class AnchorageCallsRangeJobConfig extends BaseMultiStepJobConfig<Anchora
private final JdbcTemplate jdbcTemplate;
private final BatchDateService batchDateService;
private final BatchApiLogService batchApiLogService;
private final ObjectMapper objectMapper;
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "ANCHORAGE_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public AnchorageCallsRangeJobConfig(
@@ -56,7 +62,8 @@ public class AnchorageCallsRangeJobConfig extends BaseMultiStepJobConfig<Anchora
@Qualifier("maritimeServiceApiWebClient")WebClient maritimeServiceApiWebClient,
JdbcTemplate jdbcTemplate,
BatchDateService batchDateService,
BatchApiLogService batchApiLogService
BatchApiLogService batchApiLogService,
ObjectMapper objectMapper
) {
super(jobRepository, transactionManager);
this.anchorageCallsProcessor = anchorageCallsProcessor;
@@ -66,6 +73,7 @@ public class AnchorageCallsRangeJobConfig extends BaseMultiStepJobConfig<Anchora
this.jdbcTemplate = jdbcTemplate;
this.batchDateService = batchDateService;
this.batchApiLogService = batchApiLogService;
this.objectMapper = objectMapper;
}
@Override
@@ -106,6 +114,14 @@ public class AnchorageCallsRangeJobConfig extends BaseMultiStepJobConfig<Anchora
protected ItemProcessor<AnchorageCallsDto, AnchorageCallsEntity> createProcessor() {
return anchorageCallsProcessor;
}
@Bean
@StepScope
public AnchorageCallsProcessor anchorageCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new AnchorageCallsProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<AnchorageCallsEntity> createWriter() { // type changed

View file

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.BerthCallsDto;
import com.snp.batch.jobs.movement.batch.entity.BerthCallsEntity;
@@ -42,9 +43,13 @@ public class BerthCallsRangJobConfig extends BaseMultiStepJobConfig<BerthCallsDt
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "BERTH_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public BerthCallsRangJobConfig(
JobRepository jobRepository,
@@ -104,6 +109,15 @@ public class BerthCallsRangJobConfig extends BaseMultiStepJobConfig<BerthCallsDt
return berthCallsProcessor;
}
@Bean
@StepScope
public BerthCallsProcessor berthCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new BerthCallsProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<BerthCallsEntity> createWriter() {
return berthCallsWriter;

View file

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.CurrentlyAtDto;
import com.snp.batch.jobs.movement.batch.entity.CurrentlyAtEntity;
@@ -42,9 +43,13 @@ public class CurrentlyAtRangeJobConfig extends BaseMultiStepJobConfig<CurrentlyA
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "CURRENTLY_AT_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public CurrentlyAtRangeJobConfig(
JobRepository jobRepository,
@@ -103,7 +108,14 @@ public class CurrentlyAtRangeJobConfig extends BaseMultiStepJobConfig<CurrentlyA
protected ItemProcessor<CurrentlyAtDto, CurrentlyAtEntity> createProcessor() {
return currentlyAtProcessor;
}
@Bean
@StepScope
public CurrentlyAtProcessor currentlyAtProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new CurrentlyAtProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<CurrentlyAtEntity> createWriter() { // type changed
return currentlyAtWriter;

View file

@@ -1,106 +0,0 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.jobs.movement.batch.processor.DarkActivityProcessor;
import com.snp.batch.jobs.movement.batch.reader.DarkActivityReader;
import com.snp.batch.jobs.movement.batch.writer.DarkActivityWriter;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.web.reactive.function.client.WebClient;
/**
* Ship detail import Job Config
*
* Characteristics:
* - Looks up IMO numbers from the ship_data table
* - Splits the IMO numbers into batches of 100
* - Calls the Maritime API GetShipsByIHSLRorIMONumbers
* TODO : switch to calling GetShipsByIHSLRorIMONumbersAll
* - Stores ship details in the ship_detail table (UPSERT)
*
* Data flow:
* DarkActivityReader (ship_data Maritime API)
* (DarkActivityDto)
* DarkActivityProcessor
* (DarkActivityEntity)
* DarkActivityWriter
* (t_darkactivity table)
*/
@Slf4j
@Configuration
public class DarkActivityJobConfig extends BaseJobConfig<DarkActivityDto, DarkActivityEntity> {
private final DarkActivityProcessor darkActivityProcessor;
private final DarkActivityWriter darkActivityWriter;
private final JdbcTemplate jdbcTemplate;
private final WebClient maritimeApiWebClient;
public DarkActivityJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
DarkActivityProcessor darkActivityProcessor,
DarkActivityWriter darkActivityWriter, JdbcTemplate jdbcTemplate,
@Qualifier("maritimeServiceApiWebClient") WebClient maritimeApiWebClient,
ObjectMapper objectMapper) { // ObjectMapper injection added
super(jobRepository, transactionManager);
this.darkActivityProcessor = darkActivityProcessor;
this.darkActivityWriter = darkActivityWriter;
this.jdbcTemplate = jdbcTemplate;
this.maritimeApiWebClient = maritimeApiWebClient;
}
@Override
protected String getJobName() {
return "DarkActivityImportJob";
}
@Override
protected String getStepName() {
return "DarkActivityImportStep";
}
@Override
protected ItemReader<DarkActivityDto> createReader() { // type changed
// Reader constructor updated: pass the ObjectMapper.
return new DarkActivityReader(maritimeApiWebClient, jdbcTemplate);
}
@Override
protected ItemProcessor<DarkActivityDto, DarkActivityEntity> createProcessor() {
return darkActivityProcessor;
}
@Override
protected ItemWriter<DarkActivityEntity> createWriter() { // type changed
return darkActivityWriter;
}
@Override
protected int getChunkSize() {
return 5; // the API fetches 100 at a time, so the chunk is also set to 100
}
@Bean(name = "DarkActivityImportJob")
public Job darkActivityImportJob() {
return job();
}
@Bean(name = "DarkActivityImportStep")
public Step darkActivityImportStep() {
return step();
}
}

View file

@@ -1,119 +0,0 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import com.snp.batch.jobs.movement.batch.processor.DarkActivityProcessor;
import com.snp.batch.jobs.movement.batch.writer.DarkActivityWriter;
import com.snp.batch.jobs.movement.batch.reader.DarkActivityRangeReader;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.web.reactive.function.client.WebClient;
/**
* Ship detail import Job Config
*
* Characteristics:
* - Looks up IMO numbers from the ship_data table
* - Splits the IMO numbers into batches of 100
* - Calls the Maritime API GetShipsByIHSLRorIMONumbers
* TODO : switch to calling GetShipsByIHSLRorIMONumbersAll
* - Stores ship details in the ship_detail table (UPSERT)
*
* Data flow:
* DarkActivityReader (ship_data Maritime API)
* (DarkActivityDto)
* DarkActivityProcessor
* (DarkActivityEntity)
* DarkActivityWriter
* (t_darkactivity table)
*/
@Slf4j
@Configuration
public class DarkActivityRangeJobConfig extends BaseJobConfig<DarkActivityDto, DarkActivityEntity> {
private final DarkActivityProcessor darkActivityProcessor;
private final DarkActivityWriter darkActivityWriter;
private final DarkActivityRangeReader darkActivityRangeReader;
private final JdbcTemplate jdbcTemplate;
private final WebClient maritimeApiWebClient;
public DarkActivityRangeJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
DarkActivityProcessor darkActivityProcessor,
DarkActivityWriter darkActivityWriter, JdbcTemplate jdbcTemplate,
@Qualifier("maritimeServiceApiWebClient") WebClient maritimeApiWebClient,
ObjectMapper objectMapper, DarkActivityRangeReader darkActivityRangeReader) { // ObjectMapper injection added
super(jobRepository, transactionManager);
this.darkActivityProcessor = darkActivityProcessor;
this.darkActivityWriter = darkActivityWriter;
this.jdbcTemplate = jdbcTemplate;
this.maritimeApiWebClient = maritimeApiWebClient;
this.darkActivityRangeReader = darkActivityRangeReader;
}
@Override
protected String getJobName() {
return "DarkActivityRangeImportJob";
}
@Override
protected String getStepName() {
return "DarkActivityRangeImportStep";
}
@Override
protected ItemReader<DarkActivityDto> createReader() { // type changed
// Reader constructor updated: pass the ObjectMapper.
return darkActivityRangeReader;
}
@Bean
@StepScope
public DarkActivityRangeReader darkActivityReader(
@Value("#{jobParameters['startDate']}") String startDate,
@Value("#{jobParameters['stopDate']}") String stopDate
) {
// if jobParameters are absent, null comes through and the Reader applies defaults
return new DarkActivityRangeReader(maritimeApiWebClient, startDate, stopDate);
}
@Override
protected ItemProcessor<DarkActivityDto, DarkActivityEntity> createProcessor() {
return darkActivityProcessor;
}
@Override
protected ItemWriter<DarkActivityEntity> createWriter() { // type changed
return darkActivityWriter;
}
@Override
protected int getChunkSize() {
return 5000; // the API fetches 100 at a time, so the chunk is also set to 100
}
@Bean(name = "DarkActivityRangeImportJob")
public Job darkActivityRangeImportJob() {
return job();
}
@Bean(name = "DarkActivityRangeImportStep")
public Step darkActivityRangeImportStep() {
return step();
}
}
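The deleted range config passed nullable startDate/stopDate job parameters into the reader, which applied defaults when they were absent. The diff does not show the reader's actual fallback, so the one-day window below is purely an assumed illustration:

```java
import java.time.LocalDate;

// Assumed fallback (not from this repo): null parameters resolve to a
// "yesterday .. today" window; explicit parameters are parsed as ISO dates.
public class DateRangeDefaults {

    public static LocalDate[] resolve(String startDate, String stopDate) {
        LocalDate stop = (stopDate != null) ? LocalDate.parse(stopDate) : LocalDate.now();
        LocalDate start = (startDate != null) ? LocalDate.parse(startDate) : stop.minusDays(1);
        return new LocalDate[] { start, stop };
    }
}
```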

View file

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.DestinationDto;
import com.snp.batch.jobs.movement.batch.entity.DestinationEntity;
@@ -32,8 +33,8 @@ import org.springframework.web.reactive.function.client.WebClient;
@Configuration
public class DestinationsRangeJobConfig extends BaseMultiStepJobConfig<DestinationDto, DestinationEntity> {
private final DestinationProcessor DestinationProcessor;
private final DestinationWriter DestinationWriter;
private final DestinationProcessor destinationProcessor;
private final DestinationWriter destinationWriter;
private final DestinationRangeReader destinationRangeReader;
private final WebClient maritimeApiWebClient;
private final JdbcTemplate jdbcTemplate;
@@ -42,16 +43,20 @@ public class DestinationsRangeJobConfig extends BaseMultiStepJobConfig<Destinati
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "DESTINATIONS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public DestinationsRangeJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
DestinationProcessor DestinationProcessor,
DestinationWriter DestinationWriter,
DestinationProcessor destinationProcessor,
DestinationWriter destinationWriter,
DestinationRangeReader destinationRangeReader,
@Qualifier("maritimeServiceApiWebClient") WebClient maritimeApiWebClient,
JdbcTemplate jdbcTemplate,
@@ -59,8 +64,8 @@ public class DestinationsRangeJobConfig extends BaseMultiStepJobConfig<Destinati
BatchApiLogService batchApiLogService
) { // ObjectMapper injection added
super(jobRepository, transactionManager);
this.DestinationProcessor = DestinationProcessor;
this.DestinationWriter = DestinationWriter;
this.destinationProcessor = destinationProcessor;
this.destinationWriter = destinationWriter;
this.destinationRangeReader = destinationRangeReader;
this.maritimeApiWebClient = maritimeApiWebClient;
this.jdbcTemplate = jdbcTemplate;
@@ -102,12 +107,21 @@ public class DestinationsRangeJobConfig extends BaseMultiStepJobConfig<Destinati
}
@Override
protected ItemProcessor<DestinationDto, DestinationEntity> createProcessor() {
return DestinationProcessor;
return destinationProcessor;
}
@Bean
@StepScope
public DestinationProcessor destinationProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new DestinationProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<DestinationEntity> createWriter() { // type changed
return DestinationWriter;
return destinationWriter;
}
@Override

View file

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.PortCallsDto;
import com.snp.batch.jobs.movement.batch.entity.PortCallsEntity;
@@ -42,9 +43,13 @@ public class ShipPortCallsRangeJobConfig extends BaseMultiStepJobConfig<PortCall
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "PORT_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public ShipPortCallsRangeJobConfig(
JobRepository jobRepository,
@@ -106,6 +111,15 @@ public class ShipPortCallsRangeJobConfig extends BaseMultiStepJobConfig<PortCall
return portCallsProcessor;
}
@Bean
@StepScope
public PortCallsProcessor portCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new PortCallsProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<PortCallsEntity> createWriter() { // type changed
return portCallsWriter;

View file
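The recurring `getBatchUpdateSql()` change replaces the hardcoded `SNP_DATA` schema with the `app.batch.target-schema.name` property via `String.format`. A schema name is an identifier, so it cannot be bound as a JDBC `?` parameter; a defensive variant would whitelist it before interpolation. A sketch under that assumption (the allowed-schema set is hypothetical):

```java
import java.util.Set;

public class BatchSql {
    // Hypothetical whitelist: identifiers cannot be bound as '?' parameters,
    // so validate the schema name before interpolating it.
    private static final Set<String> ALLOWED_SCHEMAS = Set.of("t_snp_data", "snp_data");

    public static String batchUpdateSql(String schema, String apiKey) {
        if (!ALLOWED_SCHEMAS.contains(schema)) {
            throw new IllegalArgumentException("unexpected schema: " + schema);
        }
        // Mirrors the String.format style used in the job configs; in real
        // code the API key would be a bound parameter rather than inlined.
        return String.format(
                "UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), "
                        + "UPDATED_AT = NOW() WHERE API_KEY = '%s'",
                schema, apiKey);
    }

    public static void main(String[] args) {
        System.out.println(batchUpdateSql("t_snp_data", "TRANSITS_IMPORT_API"));
    }
}
```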

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.StsOperationDto;
import com.snp.batch.jobs.movement.batch.entity.StsOperationEntity;
@@ -42,9 +43,13 @@ public class StsOperationRangeJobConfig extends BaseMultiStepJobConfig<StsOperat
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "STS_OPERATION_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public StsOperationRangeJobConfig(
@@ -104,6 +109,14 @@ public class StsOperationRangeJobConfig extends BaseMultiStepJobConfig<StsOperat
protected ItemProcessor<StsOperationDto, StsOperationEntity> createProcessor() {
return stsOperationProcessor;
}
@Bean
@StepScope
public StsOperationProcessor stsOperationProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new StsOperationProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<StsOperationEntity> createWriter() { // type changed

View file

@@ -1,5 +1,6 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseMultiStepJobConfig;
import com.snp.batch.jobs.movement.batch.dto.TerminalCallsDto;
import com.snp.batch.jobs.movement.batch.entity.TerminalCallsEntity;
@@ -42,9 +43,13 @@ public class TerminalCallsRangeJobConfig extends BaseMultiStepJobConfig<Terminal
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "TERMINAL_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public TerminalCallsRangeJobConfig(
@@ -104,6 +109,14 @@ public class TerminalCallsRangeJobConfig extends BaseMultiStepJobConfig<Terminal
protected ItemProcessor<TerminalCallsDto, TerminalCallsEntity> createProcessor() {
return terminalCallsProcessor;
}
@Bean
@StepScope
public TerminalCallsProcessor terminalCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper
) {
return new TerminalCallsProcessor(jobExecutionId, objectMapper);
}
@Override
protected ItemWriter<TerminalCallsEntity> createWriter() { // type changed

View file

@@ -42,9 +42,13 @@ public class TransitsRangeJobConfig extends BaseMultiStepJobConfig<TransitsDto,
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "TRANSITS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public TransitsRangeJobConfig(
JobRepository jobRepository,
@@ -104,6 +108,14 @@ public class TransitsRangeJobConfig extends BaseMultiStepJobConfig<TransitsDto,
return transitsProcessor;
}
@Bean
@StepScope
public TransitsProcessor transitsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId
) {
return new TransitsProcessor(jobExecutionId);
}
@Override
protected ItemWriter<TransitsEntity> createWriter() { // type changed
return transitsWriter;

View file

@@ -1,29 +0,0 @@
package com.snp.batch.jobs.movement.batch.dto;
import lombok.Data;
@Data
public class DarkActivityDto {
private String movementType;
private String imolRorIHSNumber;
private String movementDate;
private Integer facilityId;
private String facilityName;
private String facilityType;
private Integer subFacilityId;
private String subFacilityName;
private String subFacilityType;
private String countryCode;
private String countryName;
private Double draught;
private Double latitude;
private Double longitude;
private DarkActivityPositionDto position;
private String eventStartDate;
}

View file

@@ -1,17 +0,0 @@
package com.snp.batch.jobs.movement.batch.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.Data;
@Data
public class DarkActivityPositionDto {
private boolean isNull;
private int stSrid;
private double lat;
@JsonProperty("long")
private double lon;
private double z;
private double m;
private boolean hasZ;
private boolean hasM;
}

View file

@@ -1,12 +1,10 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.SequenceGenerator;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -16,7 +14,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class AnchorageCallsEntity {
@EqualsAndHashCode(callSuper = true)
public class AnchorageCallsEntity extends BaseEntity {
private Long id;

View file
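Each entity now extends `BaseEntity` and adds `@EqualsAndHashCode(callSuper = true)`, which makes Lombok include the inherited audit fields in equality checks. A plain-Java sketch of roughly what the generated methods amount to (field names here are illustrative stand-ins, not `BaseEntity`'s actual API):

```java
import java.util.Objects;

class AuditBase {
    final long jobExecutionId; // illustrative stand-in for BaseEntity's fields

    AuditBase(long jobExecutionId) { this.jobExecutionId = jobExecutionId; }

    @Override public boolean equals(Object o) {
        return o instanceof AuditBase b && b.jobExecutionId == jobExecutionId;
    }
    @Override public int hashCode() { return Long.hashCode(jobExecutionId); }
}

public class CallSuperDemo extends AuditBase {
    final String movementType;

    public CallSuperDemo(long jobExecutionId, String movementType) {
        super(jobExecutionId);
        this.movementType = movementType;
    }

    // Roughly what @EqualsAndHashCode(callSuper = true) generates:
    @Override public boolean equals(Object o) {
        return o instanceof CallSuperDemo d
                && super.equals(d)                    // include inherited fields
                && Objects.equals(movementType, d.movementType);
    }
    @Override public int hashCode() {
        return Objects.hash(super.hashCode(), movementType);
    }

    public static void main(String[] args) {
        // Same own field, different inherited field: not equal with callSuper=true.
        System.out.println(new CallSuperDemo(1, "PortCall")
                .equals(new CallSuperDemo(2, "PortCall"))); // false
    }
}
```

Without `callSuper = true`, Lombok would compare only the subclass's own fields, so two rows from different job executions could compare equal.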

@@ -1,12 +1,10 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.SequenceGenerator;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -16,7 +14,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class BerthCallsEntity {
@EqualsAndHashCode(callSuper = true)
public class BerthCallsEntity extends BaseEntity {
private Long id;

View file

@@ -1,12 +1,10 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.SequenceGenerator;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -16,7 +14,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class CurrentlyAtEntity {
@EqualsAndHashCode(callSuper = true)
public class CurrentlyAtEntity extends BaseEntity {
private String movementType;
private String imolRorIHSNumber;
private LocalDateTime movementDate;

View file

@@ -1,41 +0,0 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
import java.time.LocalDateTime;
@Data
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class DarkActivityEntity {
private Long id;
private String movementType;
private String imolRorIHSNumber;
private LocalDateTime movementDate;
private Integer facilityId;
private String facilityName;
private String facilityType;
private Integer subFacilityId;
private String subFacilityName;
private String subFacilityType;
private String countryCode;
private String countryName;
private Double draught;
private Double latitude;
private Double longitude;
private JsonNode position;
private LocalDateTime eventStartDate;
}

View file

@@ -1,8 +1,10 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -12,7 +14,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class DestinationEntity {
@EqualsAndHashCode(callSuper = true)
public class DestinationEntity extends BaseEntity {
private String movementType;
private String imolRorIHSNumber;
private LocalDateTime movementDate;

View file

@@ -1,12 +1,10 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.SequenceGenerator;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -16,12 +14,9 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class PortCallsEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "ship_movement_id_seq")
@SequenceGenerator(name = "ship_movement_id_seq", sequenceName = "ship_movement_id_seq", allocationSize = 1)
@EqualsAndHashCode(callSuper = true)
public class PortCallsEntity extends BaseEntity {
private Long id;
private String movementType;
private String imolRorIHSNumber;
private LocalDateTime movementDate;

View file

@@ -1,8 +1,10 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -12,7 +14,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class StsOperationEntity {
@EqualsAndHashCode(callSuper = true)
public class StsOperationEntity extends BaseEntity {
private Long id;

View file

@@ -2,8 +2,10 @@ package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.JsonNode;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -13,7 +15,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class TerminalCallsEntity {
@EqualsAndHashCode(callSuper = true)
public class TerminalCallsEntity extends BaseEntity {
private Long id;

View file

@@ -1,7 +1,9 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
@@ -11,7 +13,8 @@ import java.time.LocalDateTime;
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class TransitsEntity {
@EqualsAndHashCode(callSuper = true)
public class TransitsEntity extends BaseEntity {
private String movementType;
private String imolRorIHSNumber;
private LocalDateTime movementDate;

View file

@@ -6,33 +6,28 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.AnchorageCallsDto;
import com.snp.batch.jobs.movement.batch.entity.AnchorageCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class AnchorageCallsProcessor extends BaseProcessor<AnchorageCallsDto, AnchorageCallsEntity> {
private final ObjectMapper objectMapper;
public AnchorageCallsProcessor(ObjectMapper objectMapper) {
private final Long jobExecutionId;
public AnchorageCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected AnchorageCallsEntity processItem(AnchorageCallsDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("AnchorageCalls info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -60,6 +55,8 @@ public class AnchorageCallsProcessor extends BaseProcessor<AnchorageCallsDto, An
.destination(dto.getDestination())
.iso2(dto.getIso2())
.position(positionNode) // mapped as JsonNode
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;

View file
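The processors parse API date strings with `LocalDateTime.parse(dto.getEventStartDate())`, which throws on a `null` value and on an ISO string carrying an offset such as a trailing `Z`. A null-safe variant is sketched below; the fallback to `OffsetDateTime` is an assumption about the payload format, not something this diff confirms:

```java
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.format.DateTimeParseException;

public class MovementDates {
    /**
     * Null-safe parse: LocalDateTime.parse(...) rejects strings with an
     * offset (e.g. a trailing 'Z'), so fall back to OffsetDateTime for those.
     */
    public static LocalDateTime parseOrNull(String raw) {
        if (raw == null || raw.isBlank()) return null;
        try {
            return LocalDateTime.parse(raw);
        } catch (DateTimeParseException e) {
            return OffsetDateTime.parse(raw).toLocalDateTime();
        }
    }

    public static void main(String[] args) {
        System.out.println(parseOrNull("2025-01-01T12:30:00"));
        System.out.println(parseOrNull("2025-01-01T12:30:00Z"));
        System.out.println(parseOrNull(null));
    }
}
```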

@@ -6,33 +6,29 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.BerthCallsDto;
import com.snp.batch.jobs.movement.batch.entity.BerthCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class BerthCallsProcessor extends BaseProcessor<BerthCallsDto, BerthCallsEntity> {
private final ObjectMapper objectMapper;
private final Long jobExecutionId;
public BerthCallsProcessor(ObjectMapper objectMapper) {
public BerthCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected BerthCallsEntity processItem(BerthCallsDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("BerthCalls info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -60,6 +56,8 @@ public class BerthCallsProcessor extends BaseProcessor<BerthCallsDto, BerthCalls
.parentCallId(dto.getParentCallId())
.iso2(dto.getIso2())
.eventStartDate(LocalDateTime.parse(dto.getEventStartDate()))
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;

View file

@@ -6,33 +6,29 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.CurrentlyAtDto;
import com.snp.batch.jobs.movement.batch.entity.CurrentlyAtEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class CurrentlyAtProcessor extends BaseProcessor<CurrentlyAtDto, CurrentlyAtEntity> {
private final ObjectMapper objectMapper;
private final Long jobExecutionId;
public CurrentlyAtProcessor(ObjectMapper objectMapper) {
public CurrentlyAtProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected CurrentlyAtEntity processItem(CurrentlyAtDto dto) throws Exception {
log.debug("Currently info processing started: imoNumber={}, facilityName={}",
log.debug("CurrentlyAt info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -63,6 +59,8 @@ public class CurrentlyAtProcessor extends BaseProcessor<CurrentlyAtDto, Currentl
.destination(dto.getDestination())
.iso2(dto.getIso2())
.position(positionNode) // mapped as JsonNode
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;

View file

@@ -1,66 +0,0 @@
package com.snp.batch.jobs.movement.batch.processor;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class DarkActivityProcessor extends BaseProcessor<DarkActivityDto, DarkActivityEntity> {
private final ObjectMapper objectMapper;
public DarkActivityProcessor(ObjectMapper objectMapper) {
this.objectMapper = objectMapper;
}
@Override
protected DarkActivityEntity processItem(DarkActivityDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
if (dto.getPosition() != null) {
// convert the Position object to a JsonNode
positionNode = objectMapper.valueToTree(dto.getPosition());
}
DarkActivityEntity entity = DarkActivityEntity.builder()
.movementType(dto.getMovementType())
.imolRorIHSNumber(dto.getImolRorIHSNumber())
.movementDate(LocalDateTime.parse(dto.getMovementDate()))
.facilityId(dto.getFacilityId())
.facilityName(dto.getFacilityName())
.facilityType(dto.getFacilityType())
.subFacilityId(dto.getSubFacilityId())
.subFacilityName(dto.getSubFacilityName())
.subFacilityType(dto.getSubFacilityType())
.countryCode(dto.getCountryCode())
.countryName(dto.getCountryName())
.draught(dto.getDraught())
.latitude(dto.getLatitude())
.longitude(dto.getLongitude())
.position(positionNode) // mapped as JsonNode
.eventStartDate(LocalDateTime.parse(dto.getEventStartDate()))
.build();
return entity;
}
}

View file

@@ -6,33 +6,28 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.DestinationDto;
import com.snp.batch.jobs.movement.batch.entity.DestinationEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class DestinationProcessor extends BaseProcessor<DestinationDto, DestinationEntity> {
private final ObjectMapper objectMapper;
public DestinationProcessor(ObjectMapper objectMapper) {
private final Long jobExecutionId;
public DestinationProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected DestinationEntity processItem(DestinationDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("Destinations info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -54,6 +49,8 @@ public class DestinationProcessor extends BaseProcessor<DestinationDto, Destinat
.longitude(dto.getLongitude())
.position(positionNode) // mapped as JsonNode
.iso2(dto.getIso2())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;
}

View file

@@ -6,33 +6,28 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.PortCallsDto;
import com.snp.batch.jobs.movement.batch.entity.PortCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class PortCallsProcessor extends BaseProcessor<PortCallsDto, PortCallsEntity> {
private final ObjectMapper objectMapper;
public PortCallsProcessor(ObjectMapper objectMapper) {
private final Long jobExecutionId;
public PortCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected PortCallsEntity processItem(PortCallsDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("PortCalls info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -63,7 +58,8 @@ public class PortCallsProcessor extends BaseProcessor<PortCallsDto, PortCallsEnt
.destination(dto.getDestination())
.iso2(dto.getIso2())
.position(positionNode) // mapped as JsonNode
.schemaType("PORTCALL") // distinguishes the API type
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;

View file

@@ -6,33 +6,28 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.StsOperationDto;
import com.snp.batch.jobs.movement.batch.entity.StsOperationEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class StsOperationProcessor extends BaseProcessor<StsOperationDto, StsOperationEntity> {
private final ObjectMapper objectMapper;
public StsOperationProcessor(ObjectMapper objectMapper) {
private final Long jobExecutionId;
public StsOperationProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected StsOperationEntity processItem(StsOperationDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("StsOperations info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -61,6 +56,8 @@ public class StsOperationProcessor extends BaseProcessor<StsOperationDto, StsOpe
.stsLocation(dto.getStsLocation())
.stsType(dto.getStsType())
.eventStartDate(LocalDateTime.parse(dto.getEventStartDate()))
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;

View file

@@ -6,33 +6,29 @@ import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.TerminalCallsDto;
import com.snp.batch.jobs.movement.batch.entity.TerminalCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class TerminalCallsProcessor extends BaseProcessor<TerminalCallsDto, TerminalCallsEntity> {
private final ObjectMapper objectMapper;
private final Long jobExecutionId;
public TerminalCallsProcessor(ObjectMapper objectMapper) {
public TerminalCallsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId,
ObjectMapper objectMapper)
{
this.jobExecutionId = jobExecutionId;
this.objectMapper = objectMapper;
}
@Override
protected TerminalCallsEntity processItem(TerminalCallsDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("TerminalCalls info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
@@ -63,6 +59,8 @@ public class TerminalCallsProcessor extends BaseProcessor<TerminalCallsDto, Term
.subFacilityId(dto.getSubFacilityId())
.subFacilityName(dto.getSubFacilityName())
.subFacilityType(dto.getSubFacilityType())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;

View file

@@ -1,36 +1,30 @@
package com.snp.batch.jobs.movement.batch.processor;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.TransitsDto;
import com.snp.batch.jobs.movement.batch.entity.TransitsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
* Ship detail Processor
* Converts ShipDetailDto to ShipDetailEntity
*/
/**
* Ship detail Processor (hash-compare incremental extraction)
* I: ShipDetailComparisonData (DB hash + API map data)
* O: ShipDetailUpdate (changed rows)
*/
@Slf4j
@Component
public class TransitsProcessor extends BaseProcessor<TransitsDto, TransitsEntity> {
// private final ObjectMapper objectMapper;
private final Long jobExecutionId;
// public TransitsProcessor(ObjectMapper objectMapper) {
// this.objectMapper = objectMapper;
// }
public TransitsProcessor(
@Value("#{stepExecution.jobExecution.id}") Long jobExecutionId)
{
this.jobExecutionId = jobExecutionId;
}
@Override
protected TransitsEntity processItem(TransitsDto dto) throws Exception {
log.debug("Ship detail info processing started: imoNumber={}, facilityName={}",
log.debug("Transits info processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
TransitsEntity entity = TransitsEntity.builder()
@@ -40,6 +34,8 @@ public class TransitsProcessor extends BaseProcessor<TransitsDto, TransitsEntity
.facilityName(dto.getFacilityName())
.facilityType(dto.getFacilityType())
.draught(dto.getDraught())
.jobExecutionId(jobExecutionId)
.createdBy("SYSTEM")
.build();
return entity;
}

View file

@@ -1,182 +0,0 @@
package com.snp.batch.jobs.movement.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.reactive.function.client.WebClient;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.List;
/**
* Ship detail Reader (v2.0 - chunk based)
*
* Features:
* 1. Loads every IMO number from the ship_data table (once, up front)
* 2. Splits the IMO numbers into batches of 100
* 3. Each fetchNextBatch() call hits the API with 100 IMOs
* 4. Spring Batch then processes and writes 100 records at a time
*
* Chunk flow:
* - beforeFetch(): load all IMOs (once)
* - fetchNextBatch(): API call with 100 IMOs (1,718 calls)
* - read(): return one record at a time (100 times)
* - Processor/Writer: handle 100 records
* - repeat... (1,718 chunks)
*
* Difference from the previous approach:
* - Before: load all 170k records into memory, then process and write
* - Now: load 100 records at a time, then process and write (1,718 chunks)
*/
@Slf4j
@StepScope
public class DarkActivityRangeReader extends BaseApiReader<DarkActivityDto> {
private List<DarkActivityDto> allData;
// for storing DB hash values
private int currentBatchIndex = 0;
private final int batchSize = 5000;
// @Value("#{jobParameters['startDate']}")
private String startDate;
// private String startDate = "2025-01-01";
// @Value("#{jobParameters['stopDate']}")
private String stopDate;
// private String stopDate = "2025-12-31";
/*public DarkActivityRangeReader(WebClient webClient) {
super(webClient);
enableChunkMode(); // enable chunk mode
}*/
public DarkActivityRangeReader(WebClient webClient,
@Value("#{jobParameters['startDate']}") String startDate,
@Value("#{jobParameters['stopDate']}") String stopDate) {
super(webClient);
// when no dates are given, default to the previous full day
if (startDate == null || startDate.isBlank() || stopDate == null || stopDate.isBlank()) {
LocalDate yesterday = LocalDate.now().minusDays(1);
this.startDate = yesterday.atStartOfDay().format(DateTimeFormatter.ISO_DATE_TIME) + "Z";
this.stopDate = yesterday.plusDays(1).atStartOfDay().format(DateTimeFormatter.ISO_DATE_TIME) + "Z";
} else {
this.startDate = startDate;
this.stopDate = stopDate;
}
enableChunkMode(); // enable chunk mode
}
@Override
protected String getReaderName() {
return "DarkActivityReader";
}
@Override
protected void resetCustomState() {
this.currentBatchIndex = 0;
this.allData = null;
}
@Override
protected String getApiPath() {
return "/Movements/DarkActivity";
}
@Override
protected String getApiBaseUrl() {
return "https://webservices.maritime.spglobal.com";
}
private static final String GET_ALL_IMO_QUERY =
"SELECT imo_number FROM ship_data ORDER BY id";
// "SELECT imo_number FROM snp_data.ship_data where imo_number > (select max(imo) from snp_data.t_darkactivity) ORDER BY imo_number";
/**
* Runs only once: loads every IMO number from the ship_data table
*/
@Override
protected void beforeFetch() {
log.info("[{}] requested date range: {} → {}", getReaderName(), startDate, stopDate);
}
/**
* Core chunk method: fetches and returns the next batch of 100
*
* Called again each time Spring Batch finishes 100 read() calls
*
* @return the next batch of 100 records (null when none remain)
*/
@Override
protected List<DarkActivityDto> fetchNextBatch() throws Exception {
// check whether all batches are done
if (allData == null ) {
log.info("[{}] initial API fetch: {} ~ {}", getReaderName(), startDate, stopDate);
allData = callApiWithBatch(startDate, stopDate);
if (allData == null || allData.isEmpty()) {
log.warn("[{}] no data returned → stopping", getReaderName());
return null;
}
log.info("[{}] fetched {} records in total. batchSize = {}", getReaderName(), allData.size(), batchSize);
}
// 2) stop if everything has already been read
if (currentBatchIndex >= allData.size()) {
log.info("[{}] all batches processed", getReaderName());
return null;
}
// 3) compute this batch's end index
int endIndex = Math.min(currentBatchIndex + batchSize, allData.size());
// slice the current batch of IMO numbers (100 items)
List<DarkActivityDto> batch = allData.subList(currentBatchIndex, endIndex);
int currentBatchNumber = (currentBatchIndex / batchSize) + 1;
int totalBatches = (int) Math.ceil((double) allData.size() / batchSize);
log.info("[{}] processing batch {}/{}: {} records", getReaderName(), currentBatchNumber, totalBatches, batch.size());
currentBatchIndex = endIndex;
updateApiCallStats(totalBatches, currentBatchNumber);
return batch;
}
/**
* API call using query parameters
*
* @param startDate,stopDate
* @return API response
*/
private List<DarkActivityDto> callApiWithBatch(String startDate, String stopDate){
String url = getApiPath() + "?startDate=" + startDate +"&stopDate="+stopDate;
// +"&lrno=" + lrno;
log.debug("[{}] API 호출: {}", getReaderName(), url);
return webClient.get()
.uri(url)
.retrieve()
.bodyToFlux(DarkActivityDto.class)
.collectList()
.block();
}
@Override
protected void afterFetch(List<DarkActivityDto> data) {
if (data == null) {
int totalBatches = (int) Math.ceil((double) allData.size() / batchSize);
log.info("[{}] 전체 {} 개 배치 처리 완료", getReaderName(), totalBatches);
/* log.info("[{}] 총 {} 개의 IMO 번호에 대한 API 호출 종료",
getReaderName(), allData.size());*/
}
}
}
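The reader above fetches the whole date range once, then hands the result back to Spring Batch in fixed-size slices, returning null when the input is exhausted. A minimal standalone sketch of that index arithmetic (the `BatchSlicer` name and its API are illustrative, not part of this project):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the fetch-once-then-slice pattern used by fetchNextBatch():
// the full result is loaded once, then returned in batchSize-sized sublists,
// with null signalling "no more batches" to the caller.
public class BatchSlicer<T> {
    private final List<T> allData;
    private final int batchSize;
    private int currentBatchIndex = 0;

    public BatchSlicer(List<T> allData, int batchSize) {
        this.allData = allData;
        this.batchSize = batchSize;
    }

    /** Returns the next batch, or null once everything has been consumed. */
    public List<T> next() {
        if (currentBatchIndex >= allData.size()) {
            return null; // mirrors the reader's end-of-input contract
        }
        int endIndex = Math.min(currentBatchIndex + batchSize, allData.size());
        List<T> batch = allData.subList(currentBatchIndex, endIndex);
        currentBatchIndex = endIndex;
        return batch;
    }

    /** Same ceiling division the reader logs as its expected batch count. */
    public static int totalBatches(int totalSize, int batchSize) {
        return (int) Math.ceil((double) totalSize / batchSize);
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 17; i++) data.add(i);
        BatchSlicer<Integer> slicer = new BatchSlicer<>(data, 5);
        int batches = 0;
        while (slicer.next() != null) batches++;
        System.out.println(batches + " batches of " + totalBatches(17, 5) + " expected");
    }
}
```

The `Math.min` bound makes the last batch shorter instead of overflowing, which is why the logged batch size can be below `batchSize` on the final chunk.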


@@ -1,212 +0,0 @@
package com.snp.batch.jobs.movement.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.reactive.function.client.WebClient;
import java.util.Collections;
import java.util.List;
import java.util.Map;
/**
* Ship detail Reader (v2.0, chunk-based)
*
* What it does:
* 1. Loads every IMO number from the ship_data table (once, up front)
* 2. Splits the IMO numbers into batches of 100
* 3. Calls the API with 100 IMOs on each fetchNextBatch() invocation
* 4. Spring Batch then runs Process → Write on 100 records at a time
*
* Chunk processing flow:
* - beforeFetch() → load all IMOs (once)
* - fetchNextBatch() → API call with 100 IMOs (1,718 calls)
* - read() → return records one at a time (100 times)
* - Processor/Writer → handle 100 records
* - repeat... (1,718 chunks)
*
* Difference from the old approach:
* - Old: load all 170k records into memory → Process → Write
* - New: load 100 records at a time → Process → Write (1,718 chunks)
*/
@Slf4j
@StepScope
public class DarkActivityReader extends BaseApiReader<DarkActivityDto> {
private final JdbcTemplate jdbcTemplate;
// Batch processing state
private List<String> allImoNumbers;
// Map holding the DB-side hash values
private Map<String, String> dbMasterHashes;
private int currentBatchIndex = 0;
private final int batchSize = 5;
// @Value("#{jobParameters['startDate']}")
// private String startDate;
private String startDate = "2025-01-01";
// @Value("#{jobParameters['stopDate']}")
// private String stopDate;
private String stopDate = "2025-12-31";
public DarkActivityReader(WebClient webClient, JdbcTemplate jdbcTemplate ) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
enableChunkMode(); // Enable chunk mode
}
@Override
protected String getReaderName() {
return "DarkActivityReader";
}
@Override
protected void resetCustomState() {
this.currentBatchIndex = 0;
this.allImoNumbers = null;
this.dbMasterHashes = null;
}
@Override
protected String getApiPath() {
return "/Movements/DarkActivity";
}
@Override
protected String getApiBaseUrl() {
return "https://webservices.maritime.spglobal.com";
}
private static final String GET_ALL_IMO_QUERY =
"SELECT imo_number FROM ship_data ORDER BY id";
// "SELECT imo_number FROM snp_data.ship_data where imo_number > (select max(imo) from snp_data.t_darkactivity) ORDER BY imo_number";
/**
* Runs only once: loads every IMO number from the ship_data table
*/
@Override
protected void beforeFetch() {
// Pre-processing
// Step 1. Load all IMO numbers
log.info("[{}] ship_data 테이블에서 IMO 번호 조회 시작...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_ALL_IMO_QUERY, String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] 총 {} 개의 IMO 번호 조회 완료", getReaderName(), allImoNumbers.size());
log.info("[{}] {}개씩 배치로 분할하여 API 호출 예정", getReaderName(), batchSize);
log.info("[{}] 예상 배치 수: {} 개", getReaderName(), totalBatches);
// Initialize the API statistics
updateApiCallStats(totalBatches, 0);
}
/**
* Core chunk method: fetch and return the next batch of 100 records
*
* Spring Batch calls this again once it has finished the read() calls for the previous 100 records
*
* @return the next batch of 100 records (null when none remain)
*/
@Override
protected List<DarkActivityDto> fetchNextBatch() throws Exception {
// Stop once every batch has been processed
if (allImoNumbers == null || currentBatchIndex >= allImoNumbers.size()) {
return null; // end of input → job finishes
}
// Compute the current batch's start/end indexes
int startIndex = currentBatchIndex;
int endIndex = Math.min(currentBatchIndex + batchSize, allImoNumbers.size());
// Extract the current batch of IMO numbers (100 of them)
List<String> currentBatch = allImoNumbers.subList(startIndex, endIndex);
int currentBatchNumber = (currentBatchIndex / batchSize) + 1;
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] 배치 {}/{} 처리 중 (IMO {} 개)...",
getReaderName(), currentBatchNumber, totalBatches, currentBatch.size());
try {
// Join the IMO numbers with commas (e.g. "1000019,1000021,1000033,...")
String imoParam = String.join(",", currentBatch);
// Call the API
List<DarkActivityDto> response = callApiWithBatch(imoParam);
// Advance the index to the next batch
currentBatchIndex = endIndex;
// Handle the response
if (response != null) {
List<DarkActivityDto> darkActivityList = response;
log.info("[{}] 배치 {}/{} 완료: {} 건 조회",
getReaderName(), currentBatchNumber, totalBatches, darkActivityList.size());
// Update the API call statistics
updateApiCallStats(totalBatches, currentBatchNumber);
// Throttle the API (wait 0.5s before the next batch)
if (currentBatchIndex < allImoNumbers.size()) {
Thread.sleep(500);
}
return darkActivityList;
} else {
log.warn("[{}] 배치 {}/{} 응답 없음",
getReaderName(), currentBatchNumber, totalBatches);
// Update the API call statistics (failures count too)
updateApiCallStats(totalBatches, currentBatchNumber);
return Collections.emptyList();
}
} catch (Exception e) {
log.error("[{}] 배치 {}/{} 처리 중 오류: {}",
getReaderName(), currentBatchNumber, totalBatches, e.getMessage(), e);
// On error, still advance to the next batch (partial failures are tolerated)
currentBatchIndex = endIndex;
// Return an empty list so the job keeps running
return Collections.emptyList();
}
}
/**
* API call using query parameters
*
* @param lrno comma-joined IMO numbers (e.g. "1000019,1000021,...")
* @return API response
*/
private List<DarkActivityDto> callApiWithBatch(String lrno) {
String url = getApiPath() + "?startDate=" + startDate + "&stopDate=" + stopDate + "&lrno=" + lrno;
log.debug("[{}] API 호출: {}", getReaderName(), url);
return webClient.get()
.uri(url)
.retrieve()
.bodyToFlux(DarkActivityDto.class)
.collectList()
.block();
}
@Override
protected void afterFetch(List<DarkActivityDto> data) {
if (data == null) {
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] 전체 {} 개 배치 처리 완료", getReaderName(), totalBatches);
log.info("[{}] 총 {} 개의 IMO 번호에 대한 API 호출 종료",
getReaderName(), allImoNumbers.size());
}
}
}
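The deleted reader above builds one comma-joined `lrno` query parameter per IMO batch and deliberately tolerates per-batch failures so the whole job keeps running. A sketch of those two behaviors, with a `Function` standing in for the WebClient call (the `ImoBatchCaller` name and method signatures are illustrative assumptions, not project code):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.Function;

// Sketch of the per-batch call loop: join IMOs into the lrno parameter,
// call the API, and absorb per-batch failures as empty results so the
// overall job continues (partial-failure tolerance).
public class ImoBatchCaller {
    /** Builds the query string the reader logs before each call. */
    public static String buildUrl(String path, String start, String stop, List<String> imos) {
        return path + "?startDate=" + start + "&stopDate=" + stop
                + "&lrno=" + String.join(",", imos);
    }

    /** Runs every batch; a failing batch contributes an empty result, not an abort. */
    public static <R> List<List<R>> callAll(List<List<String>> batches,
                                            Function<List<String>, List<R>> fetch) {
        List<List<R>> results = new ArrayList<>();
        for (List<String> batch : batches) {
            try {
                List<R> r = fetch.apply(batch);
                results.add(r != null ? r : Collections.emptyList());
            } catch (Exception e) {
                // Mirror the reader's catch block: record nothing, move on
                results.add(Collections.emptyList());
            }
        }
        return results;
    }
}
```

Returning `Collections.emptyList()` instead of rethrowing is the design choice that keeps one bad batch from failing the Step; the trade-off is that failed IMOs are only visible in the logs, not in the job's exit status.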


@@ -1,37 +1,43 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.AnchorageCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.sql.Types;
import java.util.List;
/**
* Ship detail Repository implementation
* Extends BaseJdbcRepository to provide JDBC-based CRUD
*/
@Slf4j
@Repository("anchorageCallsRepository")
public class AnchorageCallsRepositoryImpl extends BaseJdbcRepository<AnchorageCallsEntity, String>
implements AnchorageCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-001}")
private String tableName;
public AnchorageCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
// return "snp_data.t_anchoragecall";
return "new_snp.t_anchoragecall";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -46,49 +52,29 @@ public class AnchorageCallsRepositoryImpl extends BaseJdbcRepository<AnchorageCa
@Override
public String getInsertSql() {
/*return """
INSERT INTO snp_data.t_anchoragecall(*/
return """
INSERT INTO new_snp.t_anchoragecall(
imo,
INSERT INTO %s(
imo_no,
mvmn_type,
mvmn_dt,
stpov_id,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
ntn_cd,
ntn_nm,
prtcll_id,
facility_id,
facility_nm,
facility_type,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
dstn,
iso2_ntn_cd,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo,mvmn_type, mvmn_dt)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
stpov_id = EXCLUDED.stpov_id,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
lwrnk_fclty_id = EXCLUDED.lwrnk_fclty_id,
lwrnk_fclty_nm = EXCLUDED.lwrnk_fclty_nm,
lwrnk_fclty_type = EXCLUDED.lwrnk_fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
ntn_nm = EXCLUDED.ntn_nm,
draft = EXCLUDED.draft,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
dstn = EXCLUDED.dstn,
iso2_ntn_cd = EXCLUDED.iso2_ntn_cd,
lcinfo = EXCLUDED.lcinfo
""";
dest,
iso_two_country_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
@@ -122,8 +108,8 @@ public class AnchorageCallsRepositoryImpl extends BaseJdbcRepository<AnchorageCa
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
// ps.setString(i++, e.getSchemaType());
ps.setObject(i++, e.getJobExecutionId(), Types.INTEGER);
ps.setString(i++, e.getCreatedBy());
}
@@ -150,53 +136,7 @@ public class AnchorageCallsRepositoryImpl extends BaseJdbcRepository<AnchorageCa
public void saveAll(List<AnchorageCallsEntity> entities) {
if (entities == null || entities.isEmpty()) return;
// log.info("ShipMovement 저장 시작 = {}건", entities.size());
batchInsert(entities);
}
/**
* ShipDetailEntity RowMapper
*/
private static class AnchorageCallsRowMapper implements RowMapper<AnchorageCallsEntity> {
@Override
public AnchorageCallsEntity mapRow(ResultSet rs, int rowNum) throws SQLException {
AnchorageCallsEntity entity = AnchorageCallsEntity.builder()
.id(rs.getLong("id"))
.imolRorIHSNumber(rs.getString("imolRorIHSNumber"))
.portCallId(rs.getObject("portCallId", Integer.class))
.facilityId(rs.getObject("facilityId", Integer.class))
.facilityName(rs.getString("facilityName"))
.facilityType(rs.getString("facilityType"))
.subFacilityId(rs.getObject("subFacilityId", Integer.class))
.subFacilityName(rs.getString("subFacilityName"))
.subFacilityType(rs.getString("subFacilityType"))
.countryCode(rs.getString("countryCode"))
.countryName(rs.getString("countryName"))
.draught(rs.getObject("draught", Double.class))
.latitude(rs.getObject("latitude", Double.class))
.longitude(rs.getObject("longitude", Double.class))
.destination(rs.getString("destination"))
.iso2(rs.getString("iso2"))
.position(parseJson(rs.getString("position")))
.build();
Timestamp movementDate = rs.getTimestamp("movementDate");
if (movementDate != null) {
entity.setMovementDate(movementDate.toLocalDateTime());
}
return entity;
}
private JsonNode parseJson(String json) {
try {
if (json == null) return null;
return new ObjectMapper().readTree(json);
} catch (Exception e) {
throw new RuntimeException("JSON 파싱 오류: " + json);
}
}
}
}
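The repository diffs above all make the same move: the hardcoded `new_snp.*` table names are replaced by a `%s` placeholder that `getInsertSql()` fills in with a schema and table read from configuration. A minimal sketch of that composition (the `TableTemplating` class and its constructor are hypothetical stand-ins for the `@Value`-injected fields):

```java
// Sketch of the schema/table templating the repositories switched to:
// getTableName() composes "<schema>.<table>" and getInsertSql() formats it
// into a text-block SQL template instead of hardcoding new_snp.* names.
public class TableTemplating {
    private final String targetSchema; // stands in for @Value("${app.batch.target-schema.name}")
    private final String tableName;    // stands in for @Value("${app.batch.target-schema.tables.movements-001}")

    public TableTemplating(String targetSchema, String tableName) {
        this.targetSchema = targetSchema;
        this.tableName = tableName;
    }

    public String getTableName() {
        return targetSchema + "." + tableName;
    }

    public String getInsertSql() {
        // Text block (Java 15+) with the table name injected via String.formatted
        return """
                INSERT INTO %s(
                    imo_no,
                    mvmn_type,
                    mvmn_dt
                ) VALUES (?, ?, ?);
                """.formatted(getTableName());
    }
}
```

Only the identifier is templated; the value placeholders stay as `?` so the statement remains a normal parameterized PreparedStatement.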


@@ -1,37 +1,43 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.BerthCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.sql.Types;
import java.util.List;
/**
* Ship detail Repository implementation
* Extends BaseJdbcRepository to provide JDBC-based CRUD
*/
@Slf4j
@Repository("BerthCallsRepository")
public class BerthCallsRepositoryImpl extends BaseJdbcRepository<BerthCallsEntity, String>
implements BerthCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-002}")
private String tableName;
public BerthCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
// return "snp_data.t_berthcall";
return "new_snp.t_berthcall";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -46,49 +52,29 @@ public class BerthCallsRepositoryImpl extends BaseJdbcRepository<BerthCallsEntit
@Override
public String getInsertSql() {
/*return """
INSERT INTO snp_data.t_berthcall(*/
return """
INSERT INTO new_snp.t_berthcall(
imo,
INSERT INTO %s(
imo_no,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
facility_id,
facility_nm,
facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
prnt_call_id,
iso2_ntn_cd,
evt_start_dt,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo, mvmn_type, mvmn_dt)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
up_fclty_id = EXCLUDED.up_fclty_id,
up_fclty_nm = EXCLUDED.up_fclty_nm,
up_fclty_type = EXCLUDED.up_fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
ntn_nm = EXCLUDED.ntn_nm,
draft = EXCLUDED.draft,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
prnt_call_id = EXCLUDED.prnt_call_id,
iso2_ntn_cd = EXCLUDED.iso2_ntn_cd,
evt_start_dt = EXCLUDED.evt_start_dt,
lcinfo = EXCLUDED.lcinfo
""";
up_clot_id,
iso_two_country_cd,
event_sta_dt,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
@@ -122,6 +108,8 @@ public class BerthCallsRepositoryImpl extends BaseJdbcRepository<BerthCallsEntit
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
ps.setObject(i++, e.getJobExecutionId(), Types.INTEGER);
ps.setString(i++, e.getCreatedBy());
}
private void setDoubleOrNull(PreparedStatement ps, int index, Double value) throws Exception {
@@ -147,47 +135,7 @@ public class BerthCallsRepositoryImpl extends BaseJdbcRepository<BerthCallsEntit
public void saveAll(List<BerthCallsEntity> entities) {
if (entities == null || entities.isEmpty()) return;
// log.info("BerthCalls 저장 시작 = {}건", entities.size());
batchInsert(entities);
}
/**
* ShipDetailEntity RowMapper
*/
private static class BerthCallsRowMapper implements RowMapper<BerthCallsEntity> {
@Override
public BerthCallsEntity mapRow(ResultSet rs, int rowNum) throws SQLException {
BerthCallsEntity entity = BerthCallsEntity.builder()
.id(rs.getLong("id"))
.imolRorIHSNumber(rs.getString("imolRorIHSNumber"))
.facilityId(rs.getObject("facilityId", Integer.class))
.facilityName(rs.getString("facilityName"))
.facilityType(rs.getString("facilityType"))
.countryCode(rs.getString("countryCode"))
.countryName(rs.getString("countryName"))
.draught(rs.getObject("draught", Double.class))
.latitude(rs.getObject("latitude", Double.class))
.longitude(rs.getObject("longitude", Double.class))
.position(parseJson(rs.getString("position")))
.build();
Timestamp movementDate = rs.getTimestamp("movementDate");
if (movementDate != null) {
entity.setMovementDate(movementDate.toLocalDateTime());
}
return entity;
}
private JsonNode parseJson(String json) {
try {
if (json == null) return null;
return new ObjectMapper().readTree(json);
} catch (Exception e) {
throw new RuntimeException("JSON 파싱 오류: " + json);
}
}
}
}


@@ -4,31 +4,40 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.CurrentlyAtEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.sql.Types;
import java.util.List;
/**
* Ship detail Repository implementation
* Extends BaseJdbcRepository to provide JDBC-based CRUD
*/
@Slf4j
@Repository("CurrentlyAtRepository")
public class CurrentlyAtRepositoryImpl extends BaseJdbcRepository<CurrentlyAtEntity, String>
implements CurrentlyAtRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-003}")
private String tableName;
public CurrentlyAtRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
// return "snp_data.t_currentlyat";
return "new_snp.t_currentlyat";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -43,55 +52,32 @@ public class CurrentlyAtRepositoryImpl extends BaseJdbcRepository<CurrentlyAtEnt
@Override
public String getInsertSql() {
/*return """
INSERT INTO snp_data.t_currentlyat(*/
return """
INSERT INTO new_snp.t_currentlyat(
imo,
INSERT INTO %s(
imo_no,
mvmn_type,
mvmn_dt,
stpov_id,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
prtcll_id,
facility_id,
facility_nm,
facility_type,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
dstn,
iso2_ntn_cd,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo, mvmn_type, mvmn_dt)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
stpov_id = EXCLUDED.stpov_id,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
lwrnk_fclty_id = EXCLUDED.lwrnk_fclty_id,
lwrnk_fclty_nm = EXCLUDED.lwrnk_fclty_nm,
lwrnk_fclty_type = EXCLUDED.lwrnk_fclty_type,
up_fclty_id = EXCLUDED.up_fclty_id,
up_fclty_nm = EXCLUDED.up_fclty_nm,
up_fclty_type = EXCLUDED.up_fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
ntn_nm = EXCLUDED.ntn_nm,
draft = EXCLUDED.draft,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
dstn = EXCLUDED.dstn,
iso2_ntn_cd = EXCLUDED.iso2_ntn_cd,
lcinfo = EXCLUDED.lcinfo
""";
dest,
country_iso_two_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
@@ -128,8 +114,8 @@ public class CurrentlyAtRepositoryImpl extends BaseJdbcRepository<CurrentlyAtEnt
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
// ps.setString(i++, e.getSchemaType());
ps.setObject(i++, e.getJobExecutionId(), Types.INTEGER);
ps.setString(i++, e.getCreatedBy());
}
@@ -156,7 +142,6 @@ public class CurrentlyAtRepositoryImpl extends BaseJdbcRepository<CurrentlyAtEnt
public void saveAll(List<CurrentlyAtEntity> entities) {
if (entities == null || entities.isEmpty()) return;
// log.info("CurrentltAt 저장 시작 = {}건", entities.size());
batchInsert(entities);
}


@@ -1,13 +0,0 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import java.util.List;
/**
* Ship detail Repository interface
*/
public interface DarkActivityRepository {
void saveAll(List<DarkActivityEntity> entities);
}


@@ -1,187 +0,0 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.List;
/**
* Ship detail Repository implementation
* Extends BaseJdbcRepository to provide JDBC-based CRUD
*/
@Slf4j
@Repository("")
public class DarkActivityRepositoryImpl extends BaseJdbcRepository<DarkActivityEntity, String>
implements DarkActivityRepository {
public DarkActivityRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "new_snp.t_darkactivity";
// return "snp_data.t_darkactivity";
}
@Override
protected String getEntityName() {
return "DarkActivity";
}
@Override
protected String extractId(DarkActivityEntity entity) {
return entity.getImolRorIHSNumber();
}
@Override
public String getInsertSql() {
// return """
// INSERT INTO snp_data.t_darkactivity(
return """
INSERT INTO new_snp.t_darkactivity(
imo,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
ntn_cd,
ntn_nm,
draft,
lat,
lon,
evt_start_dt,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo, mvmn_type, mvmn_dt, fclty_id)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
lwrnk_fclty_id = EXCLUDED.lwrnk_fclty_id,
lwrnk_fclty_nm = EXCLUDED.lwrnk_fclty_nm,
lwrnk_fclty_type = EXCLUDED.lwrnk_fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
ntn_nm = EXCLUDED.ntn_nm,
draft = EXCLUDED.draft,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
evt_start_dt = EXCLUDED.evt_start_dt,
lcinfo = EXCLUDED.lcinfo
""";
}
@Override
protected String getUpdateSql() {
return null;
}
@Override
protected void setInsertParameters(PreparedStatement ps, DarkActivityEntity e) throws Exception {
int i = 1;
ps.setString(i++, e.getImolRorIHSNumber()); // imo
ps.setString(i++, e.getMovementType()); // mvmn_type
ps.setTimestamp(i++, e.getMovementDate() != null ? Timestamp.valueOf(e.getMovementDate()) : null); // mvmn_dt
ps.setObject(i++, e.getFacilityId()); // fclty_id
ps.setString(i++, e.getFacilityName()); // fclty_nm
ps.setString(i++, e.getFacilityType()); // fclty_type
ps.setObject(i++, e.getSubFacilityId()); //lwrnk_fclty_id
ps.setString(i++, e.getSubFacilityName()); // lwrnk_fclty_nm
ps.setString(i++, e.getSubFacilityType()); //lwrnk_fclty_type
ps.setString(i++, e.getCountryCode()); // ntn_cd
ps.setString(i++, e.getCountryName()); // ntn_nm
setDoubleOrNull(ps, i++, e.getDraught()); // draft
setDoubleOrNull(ps, i++, e.getLatitude()); // lat
setDoubleOrNull(ps, i++, e.getLongitude());// lon
ps.setTimestamp(i++, e.getEventStartDate() != null ? Timestamp.valueOf(e.getEventStartDate()) : null); // evt_start_dt
if (e.getPosition() != null) {
ps.setObject(i++, OBJECT_MAPPER.writeValueAsString(e.getPosition()), java.sql.Types.OTHER); // lcinfo (jsonb)
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
}
private void setDoubleOrNull(PreparedStatement ps, int index, Double value) throws Exception {
if (value != null) {
ps.setDouble(index, value);
} else {
// Explicitly set SQL NULL using java.sql.Types.DOUBLE
ps.setNull(index, java.sql.Types.DOUBLE);
}
}
@Override
protected void setUpdateParameters(PreparedStatement ps, DarkActivityEntity entity) throws Exception {
}
@Override
protected RowMapper<DarkActivityEntity> getRowMapper() {
return null;
}
@Override
public void saveAll(List<DarkActivityEntity> entities) {
if (entities == null || entities.isEmpty()) return;
log.info("DarkActivity 저장 시작 = {}건", entities.size());
batchInsert(entities);
}
/**
* ShipDetailEntity RowMapper
*/
private static class DarkActivityRowMapper implements RowMapper<DarkActivityEntity> {
@Override
public DarkActivityEntity mapRow(ResultSet rs, int rowNum) throws SQLException {
DarkActivityEntity entity = DarkActivityEntity.builder()
.id(rs.getLong("id"))
.imolRorIHSNumber(rs.getString("imolRorIHSNumber"))
.facilityId(rs.getObject("facilityId", Integer.class))
.facilityName(rs.getString("facilityName"))
.facilityType(rs.getString("facilityType"))
.countryCode(rs.getString("countryCode"))
.countryName(rs.getString("countryName"))
.draught(rs.getObject("draught", Double.class))
.latitude(rs.getObject("latitude", Double.class))
.longitude(rs.getObject("longitude", Double.class))
.position(parseJson(rs.getString("position")))
.build();
Timestamp movementDate = rs.getTimestamp("movementDate");
if (movementDate != null) {
entity.setMovementDate(movementDate.toLocalDateTime());
}
return entity;
}
private JsonNode parseJson(String json) {
try {
if (json == null) return null;
return new ObjectMapper().readTree(json);
} catch (Exception e) {
throw new RuntimeException("JSON 파싱 오류: " + json);
}
}
}
}
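The removed RowMapper above reads nullable numeric columns with `rs.getObject(col, Double.class)` rather than `rs.getDouble`, which would silently turn SQL NULL into 0.0. A JDBC-free sketch of the same null-preserving idea over a plain `Map` row (the `NullSafeRow` helper is hypothetical, for illustration only):

```java
import java.util.HashMap;
import java.util.Map;

// JDBC-free sketch of the RowMapper's null handling: nullable numeric
// columns are read as boxed objects so SQL NULL stays null instead of
// collapsing to a primitive default like 0.0.
public class NullSafeRow {
    private final Map<String, Object> row;

    public NullSafeRow(Map<String, Object> row) {
        this.row = row;
    }

    /** Boxed read: absent/NULL columns come back as null, not 0.0. */
    public Double getDouble(String column) {
        Object v = row.get(column);
        return v == null ? null : ((Number) v).doubleValue();
    }

    public String getString(String column) {
        Object v = row.get(column);
        return v == null ? null : v.toString();
    }
}
```

This is the read-side mirror of the writer's `setDoubleOrNull`: both sides keep the null distinction that primitive getters and setters would lose.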


@@ -4,31 +4,40 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.DestinationEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.sql.Types;
import java.util.List;
/**
* Ship detail Repository implementation
* Extends BaseJdbcRepository to provide JDBC-based CRUD
*/
@Slf4j
@Repository("DestinationRepository")
public class DestinationRepositoryImpl extends BaseJdbcRepository<DestinationEntity, String>
implements DestinationRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-004}")
private String tableName;
public DestinationRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
// return "snp_data.t_destination";
return "new_snp.t_destination";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -43,37 +52,23 @@ public class DestinationRepositoryImpl extends BaseJdbcRepository<DestinationEnt
@Override
public String getInsertSql() {
/*return """
INSERT INTO snp_data.t_destination(*/
return """
INSERT INTO new_snp.t_destination(
imo,
INSERT INTO %s(
imo_no,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
ntn_cd,
ntn_nm,
facility_id,
facility_nm,
facility_type,
country_cd,
country_nm,
lat,
lon,
iso2_ntn_cd,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
ntn_nm = EXCLUDED.ntn_nm,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
iso2_ntn_cd = EXCLUDED.iso2_ntn_cd,
lcinfo = EXCLUDED.lcinfo
""";
country_iso_two_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
@@ -101,6 +96,8 @@ public class DestinationRepositoryImpl extends BaseJdbcRepository<DestinationEnt
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
ps.setObject(i++, e.getJobExecutionId(), Types.INTEGER);
ps.setString(i++, e.getCreatedBy());
}
private void setDoubleOrNull(PreparedStatement ps, int index, Double value) throws Exception {


@@ -4,13 +4,9 @@ import com.snp.batch.jobs.movement.batch.entity.PortCallsEntity;
import java.util.List;
/**
* Ship detail Repository interface
*/
public interface PortCallsRepository {
void saveAll(List<PortCallsEntity> entities);
boolean existsByPortCallId(Integer portCallId);
}


@@ -1,37 +1,43 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.PortCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.sql.Types;
import java.util.List;
/**
* Ship detail Repository implementation
* Extends BaseJdbcRepository to provide JDBC-based CRUD
*/
@Slf4j
@Repository("ShipMovementRepository")
public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity, String>
implements PortCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-005}")
private String tableName;
public PortCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
// return "snp_data.t_ship_stpov_info";
return "new_snp.t_ship_stpov_info";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -46,86 +52,37 @@ public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity,
@Override
public String getInsertSql() {
// return """
// INSERT INTO snp_data.t_ship_stpov_info(
return """
INSERT INTO new_snp.t_ship_stpov_info(
imo,
INSERT INTO %s(
imo_no,
mvmn_type,
mvmn_dt,
stpov_id,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
prtcll_id,
facility_id,
facility_nm,
facility_type,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
dstn,
iso2_ntn_cd,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo, mvmn_type, mvmn_dt)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
stpov_id = EXCLUDED.stpov_id,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
lwrnk_fclty_id = EXCLUDED.lwrnk_fclty_id,
lwrnk_fclty_nm = EXCLUDED.lwrnk_fclty_nm,
lwrnk_fclty_type = EXCLUDED.lwrnk_fclty_type,
up_fclty_id = EXCLUDED.up_fclty_id,
up_fclty_nm = EXCLUDED.up_fclty_nm,
up_fclty_type = EXCLUDED.up_fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
draft = EXCLUDED.draft,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
dstn = EXCLUDED.dstn,
iso2_ntn_cd = EXCLUDED.iso2_ntn_cd,
lcinfo = EXCLUDED.lcinfo
""";
dest,
country_iso_two_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
@Override
protected String getUpdateSql() {
return """
UPDATE snp_data.t_ship_stpov_info
SET vesselid = ?,
maritimemobileserviceidentitymmsinumber = ?,
shipname = ?,
callsign = ?,
flagname = ?,
portofregistry = ?,
classificationsociety = ?,
shiptypelevel5 = ?,
shiptypelevel5subtype = ?,
yearofbuild = ?,
shipbuilder = ?,
lengthoverallloa = ?,
breadthmoulded = ?,
"depth" = ?,
draught = ?,
grosstonnage = ?,
deadweight = ?,
teu = ?,
speedservice = ?,
mainenginetype = ?,
status = ?,
operator = ?,
flagcode = ?,
shiptypelevel2 = ?
WHERE ihslrorimoshipno = ?
""";
return null;
}
@Override
@@ -135,7 +92,6 @@ public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity,
ps.setString(i++, e.getMovementType()); // mvmn_type
ps.setTimestamp(i++, e.getMovementDate() != null ? Timestamp.valueOf(e.getMovementDate()) : null); // mvmn_dt
ps.setObject(i++, e.getPortCallId()); // stpov_id
// stpov_type is hardcoded to 'PORTCALL', so it is not set here
ps.setObject(i++, e.getFacilityId()); // fclty_id
ps.setString(i++, e.getFacilityName()); // fclty_nm
ps.setString(i++, e.getFacilityType()); // fclty_type
@@ -158,8 +114,8 @@ public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity,
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
// ps.setString(i++, e.getSchemaType());
ps.setObject(i++, e.getJobExecutionId(), Types.INTEGER);
ps.setString(i++, e.getCreatedBy());
}
@@ -179,7 +135,7 @@ public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity,
@Override
protected RowMapper<PortCallsEntity> getRowMapper() {
-        return new ShipMovementRowMapper();
+        return null;
}
@Override
@@ -190,65 +146,4 @@ public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity,
batchInsert(entities);
}
-    @Override
-    public boolean existsByPortCallId(Integer portCallId) {
-        String sql = """
-                SELECT COUNT(1)
-                FROM ship_movement
-                WHERE portCallId = ?
-                """;
-        Integer count = jdbcTemplate.queryForObject(sql, Integer.class, portCallId);
-        return count != null && count > 0;
-    }
-
-    /**
-     * PortCallsEntity RowMapper
-     */
-    private static class ShipMovementRowMapper implements RowMapper<PortCallsEntity> {
-        @Override
-        public PortCallsEntity mapRow(ResultSet rs, int rowNum) throws SQLException {
-            PortCallsEntity entity = PortCallsEntity.builder()
-                    .id(rs.getLong("id"))
-                    .imolRorIHSNumber(rs.getString("imolRorIHSNumber"))
-                    .portCallId(rs.getObject("portCallId", Integer.class))
-                    .facilityId(rs.getObject("facilityId", Integer.class))
-                    .facilityName(rs.getString("facilityName"))
-                    .facilityType(rs.getString("facilityType"))
-                    .subFacilityId(rs.getObject("subFacilityId", Integer.class))
-                    .subFacilityName(rs.getString("subFacilityName"))
-                    .subFacilityType(rs.getString("subFacilityType"))
-                    .parentFacilityId(rs.getObject("parentFacilityId", Integer.class))
-                    .parentFacilityName(rs.getString("parentFacilityName"))
-                    .parentFacilityType(rs.getString("parentFacilityType"))
-                    .countryCode(rs.getString("countryCode"))
-                    .countryName(rs.getString("countryName"))
-                    .draught(rs.getObject("draught", Double.class))
-                    .latitude(rs.getObject("latitude", Double.class))
-                    .longitude(rs.getObject("longitude", Double.class))
-                    .destination(rs.getString("destination"))
-                    .iso2(rs.getString("iso2"))
-                    .position(parseJson(rs.getString("position")))
-                    .schemaType(rs.getString("schemaType"))
-                    .build();
-            Timestamp movementDate = rs.getTimestamp("movementDate");
-            if (movementDate != null) {
-                entity.setMovementDate(movementDate.toLocalDateTime());
-            }
-            return entity;
-        }
-
-        private JsonNode parseJson(String json) {
-            try {
-                if (json == null) return null;
-                return new com.fasterxml.jackson.databind.ObjectMapper().readTree(json);
-            } catch (Exception e) {
-                throw new RuntimeException("JSON parsing error: " + json);
-            }
-        }
-    }
}
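Both repository implementations in this diff replace hardcoded `snp_data.*` / `new_snp.*` table names with `@Value`-injected schema and table names, then template the qualified name into the SQL via `%s` and `""".formatted(getTableName())`. A standalone sketch of that composition, assuming `getTableName()` in `BaseJdbcRepository` simply joins schema and table with a dot (the real base class may differ):

```java
public class SqlTemplateSketch {
    // In the real code these are injected, e.g.
    // @Value("${app.batch.target-schema.name}") and
    // @Value("${app.batch.target-schema.tables.movements-006}").
    private final String targetSchema;
    private final String tableName;

    public SqlTemplateSketch(String targetSchema, String tableName) {
        this.targetSchema = targetSchema;
        this.tableName = tableName;
    }

    // Assumption: the base class qualifies the table as "<schema>.<table>".
    String getTableName() {
        return targetSchema + "." + tableName;
    }

    String getInsertSql() {
        return """
                INSERT INTO %s(imo_no, mvmn_type, mvmn_dt)
                VALUES (?, ?, ?);
                """.formatted(getTableName());
    }

    public static void main(String[] args) {
        SqlTemplateSketch repo = new SqlTemplateSketch("new_snp", "t_stsoperation");
        System.out.print(repo.getInsertSql());
    }
}
```

Because only the table name is templated and every value stays a `?` placeholder, the statement remains a parameterized `PreparedStatement` and the schema can change per profile without touching the SQL.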

View file

@@ -4,31 +4,40 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.StsOperationEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.sql.Types;
import java.util.List;
/**
 * STS operation Repository implementation
 * Extends BaseJdbcRepository to provide a JDBC-based CRUD implementation
*/
@Slf4j
@Repository("StsOperationRepository")
public class StsOperationRepositoryImpl extends BaseJdbcRepository<StsOperationEntity, String>
implements StsOperationRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-006}")
private String tableName;
public StsOperationRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
-    protected String getTableName() {
-        // return "snp_data.t_stsoperation";
-        return "new_snp.t_stsoperation";
+    protected String getTargetSchema() {
+        return targetSchema;
     }
+
+    @Override
+    protected String getSimpleTableName() {
+        return tableName;
+    }
@Override
@@ -43,51 +52,30 @@ public class StsOperationRepositoryImpl extends BaseJdbcRepository<StsOperationE
@Override
public String getInsertSql() {
-        // return """
-        //         INSERT INTO snp_data.t_stsoperation(
-        return """
-                INSERT INTO new_snp.t_stsoperation(
-                imo,
+        return """
+                INSERT INTO %s(
+                imo_no,
                 mvmn_type,
                 mvmn_dt,
-                fclty_id,
-                fclty_nm,
-                fclty_type,
-                up_fclty_id,
-                up_fclty_nm,
-                up_fclty_type,
+                facility_id,
+                facility_nm,
+                facility_type,
+                up_facility_id,
+                up_facility_nm,
+                up_facility_type,
                 draft,
                 lat,
                 lon,
-                prnt_call_id,
-                ntn_cd,
-                ntn_nm,
-                sts_location,
+                up_prtcll_id,
+                country_cd,
+                country_nm,
+                sts_position,
                 sts_type,
-                evt_start_dt,
-                lcinfo
-        ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
-        ON CONFLICT (imo, mvmn_type, mvmn_dt, fclty_id)
-        DO UPDATE SET
-            mvmn_type = EXCLUDED.mvmn_type,
-            mvmn_dt = EXCLUDED.mvmn_dt,
-            fclty_id = EXCLUDED.fclty_id,
-            fclty_nm = EXCLUDED.fclty_nm,
-            fclty_type = EXCLUDED.fclty_type,
-            up_fclty_id = EXCLUDED.up_fclty_id,
-            up_fclty_nm = EXCLUDED.up_fclty_nm,
-            up_fclty_type = EXCLUDED.up_fclty_type,
-            draft = EXCLUDED.draft,
-            lat = EXCLUDED.lat,
-            lon = EXCLUDED.lon,
-            prnt_call_id = EXCLUDED.prnt_call_id,
-            ntn_cd = EXCLUDED.ntn_cd,
-            ntn_nm = EXCLUDED.ntn_nm,
-            sts_location = EXCLUDED.sts_location,
-            sts_type = EXCLUDED.sts_type,
-            evt_start_dt = EXCLUDED.evt_start_dt,
-            lcinfo = EXCLUDED.lcinfo
-        """;
+                event_sta_dt,
+                position_info,
+                job_execution_id, creatr_id
+        ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
+        """.formatted(getTableName());
}
@Override
@@ -122,6 +110,8 @@ public class StsOperationRepositoryImpl extends BaseJdbcRepository<StsOperationE
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
ps.setObject(i++, e.getJobExecutionId(), Types.INTEGER);
ps.setString(i++, e.getCreatedBy());
}
private void setDoubleOrNull(PreparedStatement ps, int index, Double value) throws Exception {

Some files were not shown because too many files have changed in this diff.
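The rewritten INSERT statements change their bind-parameter counts (the STS statement goes from 19 `?` placeholders to 21 as columns are renamed and `job_execution_id` / `creatr_id` are added), which is exactly where a column list and its placeholders tend to drift apart. A small stdlib-only sanity check one could run over such SQL strings; this is illustrative tooling, not part of the repository:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderCheck {

    // Naive count of '?' bind markers (note: would also count '?' inside string literals).
    static long countPlaceholders(String sql) {
        return sql.chars().filter(c -> c == '?').count();
    }

    // Counts columns in the parenthesized list of "INSERT INTO <table>(col, ...)".
    static int countInsertColumns(String sql) {
        Matcher m = Pattern.compile("INSERT\\s+INTO\\s+\\S+\\s*\\(([^)]*)\\)",
                Pattern.CASE_INSENSITIVE).matcher(sql);
        if (!m.find()) {
            throw new IllegalArgumentException("no INSERT column list found");
        }
        return m.group(1).split(",").length;
    }

    public static void main(String[] args) {
        String sql = "INSERT INTO new_snp.t_stsoperation(imo_no, mvmn_type, mvmn_dt) VALUES (?, ?, ?)";
        System.out.println(countInsertColumns(sql) == countPlaceholders(sql)); // true
    }
}
```

Running this against each repository's `getInsertSql()` output in a unit test catches the most common upsert-refactoring mistake (a column added without its placeholder, or vice versa) before it fails at runtime with a parameter-index error.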