Compare commits

..

111 commits

Author SHA1 Message Date
cfc80bbb0d feat: apply Gitea team project workflow structure
- .claude/rules/: team policy, Git workflow, code style, naming, testing rules
- .claude/skills/: init-project, sync-team-workflow, create-mr, fix-issue
- .claude/settings.json: deny rules + hooks
- .claude/workflow-version.json: v1.2.0 applied
- .githooks/: commit-msg (grep -P → -E for macOS compatibility), pre-commit, post-checkout
- .editorconfig, .sdkmanrc, .mvn/settings.xml (Nexus mirror)
- .gitignore: switch .claude/ team files to tracked
- CLAUDE.md: moved to project root

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 22:00:24 +09:00
0743fd4322 chore: remove unneeded script
- remove scripts/collect_signalkind_candidates.sh

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 21:54:29 +09:00
82d427bda2 chore: remove unneeded documentation
- delete DEVELOPMENT_GUIDE.md (49KB), superseded by CLAUDE.md
- delete SWAGGER_GUIDE.md (16KB), superseded by auto-generated Swagger

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 21:53:06 +09:00
290933f94f chore: rename Kafka topic and add SignalKind collection script
- tp_SNP_AIS_Signal → tp_Global_AIS_Signal (3 profiles)
- add scripts/collect_signalkind_candidates.sh

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-14 21:52:50 +09:00
LHT
178ac506bf feat: add AIS target Kafka producer pipeline 2026-02-13 03:10:38 +09:00
hyojin kim
07368f18cb 🔥 change application.yml settings 2026-02-12 10:41:27 +09:00
hyojin-kim4
a93942d4d6
🔀 apply terminology standardization (AIS excluded) (#6)
* 🔧 remove Schema/Table hardcoding

* 🔥 remove BatchSchemaProperties.java and unify schema config via @Value

* 🗃️ standardize terminology

- Facility Port
- Common Code
- Risk&Compliance
- Movement
- Event
- PSC
- ship particulars
2026-02-12 10:27:22 +09:00
hyojin-kim4
f53648290c
🔀 add data value validation columns (#4)
* 🗃️ PSC: add value validation columns

* 🗃️ Facility: add value validation columns

* 🔊 Facility: add API request logging

* 🗃️ Event: add value validation columns

* 🗃️ Movement: add value validation columns

* 🗃️ common codes: add value validation columns and an API log service

* 🗃️ IMO metadata collection: add value validation columns and an API log service

* 🗃️ Risk&Compliance: add value validation columns

* 🗃️ ship particulars: add value validation columns, remove hash-comparison process

* 🗃️ schema change: snp_data -> t_snp_data
2026-02-05 18:49:27 +09:00
hyojin kim
6555c5e28f Merge branch 'main' into develop 2026-01-23 15:06:58 +09:00
hyojin kim
3cbc2d2e94 Merge branch 'dev_movements' into develop 2026-01-21 14:36:14 +09:00
hyojin kim
a59c91ae1f Merge branch 'dev_psc' into develop 2026-01-21 14:36:07 +09:00
hyojin kim
30304de4e6 🗃️ ship_detail_data, additionalshipsdata: add collection of the datasetversion column 2026-01-21 14:31:56 +09:00
hyojin kim
7a1b24e381 🗃️ Dark Activity Confirmed: add collection of the area_country column 2026-01-21 13:30:26 +09:00
hyojin kim
8d2cd09725 🗃️ apply PSC collection-excluded columns 2026-01-21 13:20:53 +09:00
hyojin kim
6c4ce9a536 🗃️ add missing Terminal Call collection columns 2026-01-21 11:17:42 +09:00
hyojin kim
9fed34e1bc 🔥 change Risk&Compliance Current/History collection method 2026-01-20 10:09:59 +09:00
hyojin kim
21368ffaff 🐛 fix insert query error 2026-01-19 15:30:13 +09:00
hyojin kim
7ab53d1bbf 🔥 remove Company Compliance collection from ship particulars 2026-01-19 10:49:54 +09:00
hyojin kim
613980c496 🔥 remove Company Compliance collection from ship particulars 2026-01-19 09:43:33 +09:00
hyojin kim
e63607a69d add Company Compliance collection job 2026-01-16 17:12:04 +09:00
hyojin kim
f4421fa455 change ship particulars request unit 2026-01-16 14:17:06 +09:00
hyojin kim
43057d74fb add Company Detail collection process 2026-01-16 14:15:00 +09:00
hyojin kim
64a3a55e78 add batch_api_log management process 2026-01-15 15:58:20 +09:00
hyojin kim
f2c4e0d14f 🔇 Web Services API Log Control 2026-01-12 15:11:05 +09:00
hyojin kim
5305f61a41 🔇 Ships API Log Control 2026-01-12 14:41:08 +09:00
hyojin kim
c3dabd370c Merge branch 'develop' into dev_shipdetail_sync 2026-01-09 16:07:28 +09:00
hyojin kim
9c021f298c Add Ship Detail Sync Job 2026-01-09 16:07:00 +09:00
hyojin kim
cbb53fd9f1 🗃️ change Core cache targets 2026-01-09 14:59:20 +09:00
49d2de1965 split off AIS Target DB sync job (cache→DB every 15 minutes)
- AisTargetDataWriter: drop DB writes, update the cache only
- new AisTargetDbSyncJob: cache→DB synchronization every 15 minutes
- AisTargetDbSyncTasklet: read the last 15 minutes of data from the cache, then UPSERT
- application.yml: add ais-target-db-sync settings

Data flow change:
- before: API (every minute) → cache + DB (about 33K rows written per minute)
- after: API (every minute) → cache only; DB stores the latest row per MMSI every 15 minutes

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-09 14:25:27 +09:00
hyojin kim
1ab78e881f 🔊 API Response Error Log Update 2026-01-09 13:39:18 +09:00
hyojin kim
4e79794750 chunk & batch size change 2026-01-09 10:21:10 +09:00
hyojin kim
abe5ea1a1c Merge branch 'dev_batchflag' into develop 2026-01-08 15:59:01 +09:00
hyojin kim
d8b8a40316 🗃️ remove batch_flag of new_snp schema 2026-01-08 15:57:46 +09:00
hyojin kim
b842ec8d54 🗃️ Crew List Unique Index Change 2026-01-08 15:28:03 +09:00
hyojin kim
e1fa48768e 💥 change and unify how the API query period is set 2026-01-08 15:12:06 +09:00
hyojin kim
87a9217853 🗃️ ais_target ddl update 2026-01-07 13:18:10 +09:00
hyojin kim
6e70e921af 🗃️ add data and columns for the AIS Target change 2026-01-05 17:42:53 +09:00
hyojin kim
3fb133e367 🗃️ add core20 columns: extra AIS columns 2026-01-05 15:04:07 +09:00
hyojin kim
31262f5dda 🔇 change logging scope 2025-12-31 13:59:23 +09:00
hyojin kim
99fcd38d24 🗃️ procedure change 2025-12-31 12:38:07 +09:00
hyojin kim
7360736cb0 🏗️ Movement Batch Package Rearrange 2025-12-31 10:53:31 +09:00
hyojin kim
6aba0f55b0 🗃️ Event Table Name Change
- SQL injection prevention
2025-12-31 10:37:20 +09:00
hyojin kim
1d2a3c53c8 Add Compliance History Value Change Manage Step 2025-12-31 09:59:25 +09:00
hyojin kim
020f16035b Merge branch 'develop' of https://github.com/GC-IncheonService-KDN/SNP-Batch into develop 2025-12-29 18:02:31 +09:00
hyojin kim
94f7d4b5c0 🔨 add multi-step job config 2025-12-29 18:02:18 +09:00
Kim JiMyeung
0a5e2e56af apply batch request parameters 2025-12-29 15:35:18 +09:00
hyojin kim
32af369f23 🗃️ change target schema for Last Position Update 2025-12-24 14:24:17 +09:00
hyojin kim
fcf1d74c38 Risk&Compliance Range Import Update 2025-12-24 14:15:13 +09:00
hyojin kim
5683000024 Merge branch 'dev_event' into develop 2025-12-23 14:39:43 +09:00
Kim JiMyeung
a7cf1647f8 load event attributes into new_snp instead of snp_data 2025-12-23 14:33:53 +09:00
hyojin kim
6d7b7c9eea Merge branch 'dev_event' into develop 2025-12-23 12:36:48 +09:00
hyojin kim
6885d41ba5 Merge branch 'dev_shipdetail' of https://github.com/GC-IncheonService-KDN/SNP-Batch into dev_shipdetail 2025-12-23 12:35:13 +09:00
hyojin kim
7b1fe1d52c 🗃️ change Ship Data schema 2025-12-23 12:33:10 +09:00
hyojin kim
bff4de17c7 🗃️ chunk size change 2025-12-23 11:28:17 +09:00
hyojin kim
bda2d812ff 🗃️ change Ship Data schema 2025-12-23 11:23:29 +09:00
Kim JiMyeung
1124c2e84a change risk and compliance jobs to range form 2025-12-23 09:42:50 +09:00
Kim JiMyeung
75531ab5e5 handle startDate/endDate logic 2025-12-22 13:11:25 +09:00
hyojin kim
4700ec862b 💩 temporary commit 2025-12-19 17:13:40 +09:00
Kim JiMyeung
e7ea47b02c Merge branch 'dev_movement_daterange' into dev_event 2025-12-19 13:59:38 +09:00
Kim JiMyeung
63e9253d7f change Movement methods to range format 2025-12-19 13:37:35 +09:00
hyojin kim
acd76bd358 develop Event Detail load process
- StartDate/EndDate extraction still to be done
2025-12-19 10:57:40 +09:00
hyojin kim
270b2a0b55 ⚰️ remove unneeded comments 2025-12-16 16:02:08 +09:00
hyojin kim
084be88b98 S&P country code / ship type code import job 2025-12-16 15:56:02 +09:00
hyojin kim
fb10e3cc39 🦖 change ship particulars table core20 > ship_detail_data 2025-12-16 10:20:46 +09:00
hyojin kim
b2167d4ec7 change how the Event range is set
- change how API_KET is set
2025-12-15 13:31:42 +09:00
hyojin kim
630c366a06 Merge branch 'dev_ship_movement' into develop 2025-12-15 10:16:25 +09:00
Kim JiMyeung
e7f4a9d912 develop AnchorageCalls, Berthcalls, DarkActivity, StsOperations, TerminalCalls jobs 2025-12-15 10:09:18 +09:00
hyojin kim
1c491de9e2 🗃️ modify application.xml 2025-12-12 15:34:02 +09:00
Kim JiMyeung
3118df3533 Merge remote-tracking branch 'origin/develop' into dev_ship_movement 2025-12-12 14:48:49 +09:00
hyojin kim
090f009529 develop ShipDetailUpdateJob
- CrewList
- StowageCommodity
- GroupBeneficialOwnerHistory
- ShipManagerHistory
- OperatorHistory
- TechnicalManagerHistory
- BareBoatCharterHistory
- NameHistory
- FlagHistory
- AdditionalInformation
- PandIHistory
- CallSignAndMmsiHistory
- IceClass
- SafetyManagementCertificateHistory
- ClassHistory
- SurveyDatesHistory
- SurveyDatesHistoryUnique
- SisterShipLinks
- StatusHistory
- SpecialFeature
- Thrusters
2025-12-12 13:12:40 +09:00
Kim JiMyeung
c46a62268c fix reader 2025-12-12 11:20:13 +09:00
Kim JiMyeung
f2970872fd add mvmn_type to the on-conflict clause 2025-12-12 11:14:10 +09:00
Kim JiMyeung
ac78a1340a Merge branch 'dev_ship_movement' of https://github.com/GC-IncheonService-KDN/SNP-Batch into dev_ship_movement 2025-12-11 16:31:18 +09:00
Kim JiMyeung
3ee6ae1bf7 pscJob 2025-12-11 16:29:28 +09:00
hyojin kim
2a0a80098d Merge branch 'develop' into dev_ship_movement 2025-12-10 12:33:57 +09:00
hyojin kim
eb81be5f21 🗃️ clean up application.xml 2025-12-10 10:54:44 +09:00
hyojin kim
655318e353 🗃️ change Risk&Compliance load method (load history data) 2025-12-10 10:13:09 +09:00
hyojin kim
2e509560de Merge branch 'ais/ship_position' into develop 2025-12-10 08:54:42 +09:00
fedd89c9ca [Fix]
- add GPU DB core20 table info to the profiles
2025-12-10 08:46:15 +09:00
3dde3d0167 [Add]
- add a Classtype parameter to the real-time ship position API (ClassA/ClassB classified by presence of imo in the core20 table)
- html: remove PUT, DELETE, PATCH methods and switch to POST (security issue)
2025-12-10 08:14:28 +09:00
Kim JiMyeung
6c98ebc24f incremental jobs for Destination, Transits, CurrentlyAt 2025-12-08 17:47:30 +09:00
Kim JiMyeung
18ab11068a add empty-array handling logic 2025-12-08 13:33:57 +09:00
hyojin kim
37f61fe924 Add Port Import Job, Event Import Job 2025-12-08 13:33:37 +09:00
hyojin kim
e9b30f8817 🗃️ set JPA schema (snp_data) 2025-12-08 13:33:23 +09:00
Kim JiMyeung
34ce85f33f Merge remote-tracking branch 'origin/develop' into dev_ship_movement 2025-12-08 13:17:06 +09:00
Kim JiMyeung
919b0fc21a incremental jobs for AnchorageCalls, Berthcalls, DarkActivity, StsOperations, TerminalCalls 2025-12-08 13:00:08 +09:00
Kim JiMyeung
7941396d62 ais/ship_position into dev_ship_movement 2025-12-05 11:00:28 +09:00
Kim JiMyeung
248e9c2c46 add /snp-asi url 2025-12-05 10:17:08 +09:00
Kim JiMyeung
2671d613f3 merge develop into dev_ship_movement 2025-12-05 09:44:20 +09:00
hyojin kim
1b7fa47dbd Merge branch 'ais/ship_position' into develop 2025-12-05 09:33:59 +09:00
8d8ea53449 [Add]
- add temporary sql for cleaning up jobs that did not terminate normally (e.g. after a process restart)
2025-12-05 08:31:11 +09:00
322ecb12a6 [Fix]
- remove hardcoded urls
- store bootstrap locally, fix references
2025-12-04 15:38:01 +09:00
55d4dd5886 [Fix]
- add partition management job (create partitions 3 days ahead, auto-drop partitions older than 14 days)
- (temporary) change GPU production port to 9000
- switch the ais_target table to daily partitions (about 20GB of data per day)
2025-12-04 13:05:00 +09:00
hyojin kim
c842e982c8 Merge branch 'dev_ship_movement' into dev_ship_detail
# Conflicts:
#	src/main/java/com/snp/batch/global/config/MaritimeApiWebClientConfig.java
2025-12-02 19:11:29 +09:00
hyojin kim
44ae82e2fa Merge branch 'ais/ship_position' into dev_ship_detail
# Conflicts:
#	src/main/java/com/snp/batch/jobs/sanction/batch/reader/ComplianceDataReader.java
#	src/main/resources/application.yml
2025-12-02 19:10:15 +09:00
hyojin kim
d6cf58d737 Add Port Import Job, Event Import Job 2025-12-02 18:26:54 +09:00
5857a4a822 [Fix]
- voyage condition filter search API (SOG/COG/Heading/Destination/Status)
- bring the Swagger Status filter up to date
Under way sailing
N/A
AIS Sart
Restriced manoeuverability
Not under command
Engaged in fishing
Under way using engine
Anchored
Constrained by draught
Aground
Power Driven Towing Alongside
Power Driven Towing Astern
Moored
2025-12-02 16:44:14 +09:00
6af2fccbf0 [New features]
- aisTargetImportJob: S&P Global AIS API integration (at second 15 of every minute)
- AIS Target query API (MMSI/time/spatial/polygon/WKT search)
- voyage condition filter search API (SOG/COG/Heading/Destination/Status)
- Caffeine cache (TTL 120 minutes, up to 300K entries)
- partitionManagerJob: auto-create daily and monthly partitions once a day

[Improvements]
- API context-path: changed to /snp-api (avoids proxy config conflicts with other API services)
- BaseApiReader: add state-reset logic (fixes the zero-rows bug on re-run)
- logback-spring.xml: split log files and apply a rolling policy
Kim JiMyeung
c99b6993a7 add empty-array handling logic 2025-12-02 12:53:17 +09:00
hyojin kim
b3cb4f6f19 🗃️ set JPA schema (snp_data) 2025-12-02 12:26:49 +09:00
hyojin kim
4282fc9106 🗃️ add Risk&Compliance batch_flag 2025-11-28 18:21:21 +09:00
hyojin kim
8a3e9a973e 🗃️ apply Risk&Compliance index change 2025-11-28 10:46:44 +09:00
hyojin kim
68893f9657 🛂 change production server request URL 2025-11-28 10:43:10 +09:00
hyojin kim
5787fb5be0 Merge branch 'dev_ship_movement' into dev_ship_detail 2025-11-27 22:20:34 +09:00
hyojin kim
4ed1070a37 Merge branch 'dev_ship_movement' into dev_ship_detail
# Conflicts:
#	src/main/java/com/snp/batch/global/config/MaritimeApiWebClientConfig.java
2025-11-27 22:20:21 +09:00
hyojin kim
f9b20bdc59 🗃️ fix production access address 2025-11-27 22:03:09 +09:00
hyojin kim
7a405bb969 add swagger production address 2025-11-27 22:00:26 +09:00
hyojin kim
906611c9b8 develop Risk&Compliance data import job 2025-11-27 21:55:46 +09:00
Kim JiMyeung
e44637e1f3 movement batch 2025-11-27 16:20:05 +09:00
hyojin kim
6be90723b4 add AIS columns (COG, NavStat) to the Core20 table 2025-11-25 18:39:30 +09:00
hyojin kim
18fa95e903 🩹 remove OwnerHistory DataSetVersion hardcoding 2025-11-24 10:43:43 +09:00
98 changed files with 2,868 additions and 4,150 deletions

@@ -0,0 +1,59 @@
# Java Code Style Rules
## General
- Use Java 17+ syntax (records, sealed classes, pattern matching, text blocks)
- Indentation: 4 spaces (no tabs)
- Line length: 120 characters or less
- End every file with a trailing newline
## Class Structure
Order of members within a class:
1. static constants (public → private)
2. instance fields (public → private)
3. constructors
4. public methods
5. protected/package-private methods
6. private methods
7. inner classes/enums
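The ordering above can be sketched with a small illustrative class (a hypothetical `RouteCache`, not a class from this repository):

```java
// Hypothetical class showing the prescribed member order.
class RouteCache {
    // 1. static constants (public → private)
    public static final int MAX_ENTRIES = 300_000;
    private static final String DEFAULT_REGION = "KR";

    // 2. instance fields
    private final String region;

    // 3. constructor
    RouteCache(String region) {
        this.region = region;
    }

    // 4. public methods
    public String describe() {
        return region + ":" + MAX_ENTRIES;
    }

    // 6. private methods
    private static String fallbackRegion() {
        return DEFAULT_REGION;
    }

    // 7. inner enum
    enum State { EMPTY, WARM }
}
```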
## Spring Boot Rules
### Layering
- One-way dependencies: Controller → Service → Repository
- No business logic in controllers (request/response conversion only)
- No circular references between Service-layer classes
- No business logic in repositories
### Separating DTOs and Entities
- Never expose entities directly in API requests/responses
- Write DTOs as records or immutable classes
- Convert DTO ↔ Entity through a mapper class or factory method
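A minimal sketch of the DTO rules, assuming a hypothetical `User` entity (all names here are illustrative, not from the project):

```java
// Hypothetical entity; in the project this would be a JPA entity.
class User {
    private final Long id;
    private final String name;
    User(Long id, String name) { this.id = id; this.name = name; }
    Long getId() { return id; }
    String getName() { return name; }
}

// DTO as an immutable record, mapped via a static factory method
// instead of exposing the entity directly.
record UserResponse(Long id, String name) {
    static UserResponse from(User user) {
        return new UserResponse(user.getId(), user.getName());
    }
}
```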
### Dependency Injection
- Use constructor injection (no field injection via `@Autowired`)
- Omit `@Autowired` when there is a single constructor
- Lombok `@RequiredArgsConstructor` is allowed
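Constructor injection is plain Java, so the rule can be sketched without Spring on the classpath (hypothetical `ShipRepository`/`ShipService`; with Spring, `@Autowired` is simply omitted on the single constructor, or Lombok generates it):

```java
// Hypothetical dependency; a single abstract method so it can be mocked
// or replaced with a lambda in tests.
interface ShipRepository {
    long count();
}

class ShipService {
    // final field: injected once through the constructor, never reassigned
    private final ShipRepository shipRepository;

    ShipService(ShipRepository shipRepository) {
        this.shipRepository = shipRepository;
    }

    long shipCount() {
        return shipRepository.count();
    }
}
```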
### Transactions
- Keep `@Transactional` scope as small as possible
- Read-only operations: `@Transactional(readOnly = true)`
- Apply at the Service method level (avoid class level)
## Lombok Rules
- `@Getter`, `@Setter` allowed (avoid setters on entities)
- `@Builder` allowed
- `@Data` forbidden (use only the annotations you explicitly need)
- `@AllArgsConstructor` forbidden on its own (use together with `@Builder`)
- Use the `@Slf4j` logger
## Exception Handling
- Define custom Exception classes for business errors
- Handle exceptions globally with `@ControllerAdvice`
- Include context information in exception messages
- Never swallow exceptions in a catch block (no `// ignore`)
## Miscellaneous
- Use `Optional` only as a return type (never for fields or parameters)
- Prefer empty collections or `Optional` over returning `null`
- Use the Stream API (but extract a method once chaining exceeds 3 steps)
- No hardcoded strings/numbers: extract constants or configuration values
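The `Optional` and empty-collection rules can be sketched as follows (hypothetical `PortLookup` with illustrative data only):

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Hypothetical lookup: Optional used only as a return type, and an empty
// collection returned instead of null.
class PortLookup {
    private static final Map<String, String> PORTS = Map.of("KRPUS", "Busan");

    // Optional return instead of a nullable String.
    static Optional<String> portName(String code) {
        return Optional.ofNullable(PORTS.get(code));
    }

    // Empty list instead of null when nothing matches.
    static List<String> portsFor(String country) {
        return "KR".equals(country) ? List.of("KRPUS") : List.of();
    }
}
```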

@@ -0,0 +1,84 @@
# Git Workflow Rules
## Branch Strategy
### Branch Structure
```
main ← stable, deployable branch (protected)
└── develop ← development integration branch
    ├── feature/ISSUE-123-description
    ├── bugfix/ISSUE-456-description
    └── hotfix/ISSUE-789-urgent-fix
```
### Branch Naming
- feature branches: `feature/ISSUE-number-short-description` (e.g. `feature/ISSUE-42-user-login`)
- bugfix branches: `bugfix/ISSUE-number-short-description`
- hotfix branches: `hotfix/ISSUE-number-short-description`
- without an issue number: `feature/short-description` (e.g. `feature/add-swagger-docs`)
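One way to express the naming rule is as a regular expression. This is a sketch; the exact slug character set (lowercase alphanumerics and hyphens) is an assumption, not stated by the rules:

```java
import java.util.regex.Pattern;

// Sketch of the branch-naming rule: type prefix, optional ISSUE-number,
// then a lowercase hyphenated slug (slug charset is an assumption).
class BranchNames {
    private static final Pattern BRANCH = Pattern.compile(
        "^(feature|bugfix|hotfix)/(ISSUE-\\d+-)?[a-z0-9][a-z0-9-]*$");

    static boolean isValid(String branch) {
        return BRANCH.matcher(branch).matches();
    }
}
```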
### Branch Rules
- No direct commits/pushes to main or develop
- feature branches fork from develop
- hotfix branches fork from main
- All merges must go through an MR (Merge Request)
## Commit Message Rules
### Conventional Commits Format
```
type(scope): subject
body (optional)
footer (optional)
```
### type (required)
| type | Description |
|------|------|
| feat | New feature |
| fix | Bug fix |
| docs | Documentation change |
| style | Code formatting (no behavior change) |
| refactor | Refactoring (no behavior change) |
| test | Add/modify tests |
| chore | Build or configuration change |
| ci | CI/CD configuration change |
| perf | Performance improvement |
### scope (optional)
- A short word describing the area of change
- Korean and English both allowed (e.g. `feat(인증): 로그인 기능`, `fix(auth): token refresh`)
### subject (required)
- Describe the change concisely
- Korean and English both allowed
- 72 characters or less
- No trailing period (.)
### Examples
```
feat(auth): JWT 기반 로그인 구현
fix(배치): 야간 배치 타임아웃 수정
docs: README에 빌드 방법 추가
refactor(user-service): 중복 로직 추출
test(결제): 환불 로직 단위 테스트 추가
chore: Gradle 의존성 버전 업데이트
```
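The header format above can also be written as a regular expression. This sketch uses the same pattern that the repository's `.githooks/commit-msg` hook checks with `grep -E` (Korean characters allowed in scope and subject):

```java
import java.util.regex.Pattern;

// Same Conventional Commits check as the commit-msg git hook,
// expressed as a Java Pattern.
class CommitMessage {
    private static final Pattern HEADER = Pattern.compile(
        "^(feat|fix|docs|style|refactor|test|chore|ci|perf)"
        + "(\\([a-zA-Z0-9가-힣._-]+\\))?: .{1,72}$");

    static boolean isValid(String firstLine) {
        return HEADER.matcher(firstLine).matches();
    }
}
```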
## MR (Merge Request) Rules
### Creating an MR
- Title: same Conventional Commits format as commit messages
- Body: summary of changes, how to test, related issue numbers
- Labels: attach appropriate labels (feature, bugfix, hotfix, etc.)
### MR Review
- At least one reviewer approval required
- CI checks must pass (where configured)
- Resolve all review comments before merging
### Merging an MR
- Squash merge recommended (clean history)
- Delete the source branch after merging

.claude/rules/naming.md Normal file
@@ -0,0 +1,60 @@
# Java Naming Rules
## Packages
- All lowercase, singular
- Reverse domain: `com.gcsc.projectname.module`
- e.g. `com.gcsc.batch.scheduler`, `com.gcsc.api.auth`
## Classes
- PascalCase
- Nouns or noun phrases
- Role indicated by suffix:
| Layer | Suffix | Example |
|------|--------|------|
| Controller | `Controller` | `UserController` |
| Service | `Service` | `UserService` |
| Service impl | `ServiceImpl` | `UserServiceImpl` (only when an interface exists) |
| Repository | `Repository` | `UserRepository` |
| Entity | (none) | `User`, `ShipRoute` |
| Request DTO | `Request` | `CreateUserRequest` |
| Response DTO | `Response` | `UserResponse` |
| Configuration | `Config` | `SecurityConfig` |
| Exception | `Exception` | `UserNotFoundException` |
| Enum | (none) | `UserStatus`, `ShipType` |
| Mapper | `Mapper` | `UserMapper` |
## Methods
- camelCase
- Start with a verb
- CRUD patterns:
| Operation | Controller | Service | Repository |
|------|-----------|---------|------------|
| Read (single) | `getUser()` | `getUser()` | `findById()` |
| Read (list) | `getUsers()` | `getUsers()` | `findAll()` |
| Create | `createUser()` | `createUser()` | `save()` |
| Update | `updateUser()` | `updateUser()` | `save()` |
| Delete | `deleteUser()` | `deleteUser()` | `deleteById()` |
| Exists | - | `existsUser()` | `existsById()` |
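A minimal in-memory sketch of the CRUD naming table (hypothetical `Port` domain; `PortRepository` mimics Spring Data method names without depending on the framework):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Hypothetical entity for illustration.
record Port(String code, String name) {}

// Repository layer: find*/save/deleteById/existsById.
class PortRepository {
    private final Map<String, Port> store = new LinkedHashMap<>();

    Optional<Port> findById(String code) { return Optional.ofNullable(store.get(code)); }
    List<Port> findAll() { return List.copyOf(store.values()); }
    Port save(Port port) { store.put(port.code(), port); return port; }
    void deleteById(String code) { store.remove(code); }
    boolean existsById(String code) { return store.containsKey(code); }
}

// Service layer: get*/create*/delete* verbs per the table above.
class PortService {
    private final PortRepository portRepository;

    PortService(PortRepository portRepository) { this.portRepository = portRepository; }

    Port createPort(Port port) { return portRepository.save(port); }
    Optional<Port> getPort(String code) { return portRepository.findById(code); }
    void deletePort(String code) { portRepository.deleteById(code); }
}
```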
## Variables
- camelCase
- Meaningful names (no single-letter variables; loop indices `i, j, k` excepted)
- boolean: `is`, `has`, `can`, `should` prefixes
- e.g. `isActive`, `hasPermission`, `canDelete`
## Constants
- UPPER_SNAKE_CASE
- e.g. `MAX_RETRY_COUNT`, `DEFAULT_PAGE_SIZE`
## Tests
- Classes: `{TargetClass}Test` (e.g. `UserServiceTest`)
- Methods: `{method}_{scenario}_{expectedResult}`, or a Korean `@DisplayName`
- e.g. `createUser_withDuplicateEmail_throwsException()`
- e.g. `@DisplayName("중복 이메일로 생성 시 예외 발생")`
## Files/Directories
- Java files: PascalCase (same as the class name)
- Resource files: kebab-case (e.g. `application-local.yml`)
- SQL files: `V{number}__{description}.sql` (Flyway) or kebab-case

@@ -0,0 +1,34 @@
# Team Policy
These rules are mandatory across the entire organization.
Projects may define additional rules under their own `.claude/rules/`, but must not violate this policy.
## Security Policy
### Forbidden Actions
- Reading or printing the contents of `.env`, `.env.*`, or `secrets/` files
- Hardcoding passwords, API keys, tokens, or other secrets in code
- Running `git push --force`, `git reset --hard`, or `git clean -fd`
- Running destructive commands such as `rm -rf /`, `rm -rf ~`, `rm -rf .git`
- Pushing directly to main/develop (merge only through MRs)
### Credential Management
- Manage via environment variables or external config files (`.env`, `application-local.yml`)
- Those config files must be listed in `.gitignore`
- Commit only example files (`.env.example`, `application.yml.example`)
## Code Quality Policy
### Required Checks
- Verify the build (compilation) succeeds before committing
- Keep lint warnings at zero (also enforced in CI)
- Projects with test suites must keep their tests passing
### Code Review
- At least one reviewer required when merging to main
- No merge without reviewer approval
## Documentation Policy
- Every public API (controller endpoint) must have a descriptive comment
- Comment the intent of complex business logic
- Keep build/run instructions current in README.md

.claude/rules/testing.md Normal file
@@ -0,0 +1,62 @@
# Java Testing Rules
## Test Frameworks
- JUnit 5 + AssertJ
- Mockito for mocking dependencies
- Spring Boot Test (`@SpringBootTest`) only for integration tests
## Test Structure
### Unit Tests
- Cover Service, Util, and domain logic
- Run without loading the Spring context (`@ExtendWith(MockitoExtension.class)`)
- Mock external dependencies with Mockito
```java
@ExtendWith(MockitoExtension.class)
class UserServiceTest {
    @InjectMocks
    private UserService userService;
    @Mock
    private UserRepository userRepository;
    @Test
    @DisplayName("사용자 생성 시 정상 저장")
    void createUser_withValidInput_savesUser() {
        // given
        // when
        // then
    }
}
```
### Integration Tests
- Controller tests: `@WebMvcTest` + `MockMvc`
- Repository tests: `@DataJpaTest`
- Full-flow tests: `@SpringBootTest` (keep to a minimum)
### Test Pattern
- Use the **Given-When-Then** structure
- Mark each section with a comment
- One assertion concern per test (where practical)
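A framework-free sketch of the Given-When-Then shape; in the project this body would sit inside a JUnit 5 `@Test` method with AssertJ assertions, and the method under test would live in production code (both are hypothetical here):

```java
// Hypothetical production method plus a Given-When-Then test body,
// written without JUnit so the structure itself is visible.
class DiscountCalculatorTest {
    // stand-in for the production code under test
    static int discountedPrice(int price, int rate) {
        return price - price * rate / 100;
    }

    static void discountedPrice_withTenPercent_reducesPrice() {
        // given
        int price = 1000;
        int rate = 10;
        // when
        int result = discountedPrice(price, rate);
        // then
        if (result != 900) throw new AssertionError("expected 900 but was " + result);
    }
}
```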
## Test Naming
- Method names: `{method}_{scenario}_{expectedResult}` pattern
- `@DisplayName`: describe the test's intent in Korean
## Test Coverage
- New Service classes: tests for core business logic are required
- When modifying existing code: tests for the modified logic are recommended
- Controllers: integration tests for key API endpoints are recommended
## Test Data
- Create test data inside the test method or in `@BeforeEach`
- Extract shared test data into TestFixture classes
- When a real DB is needed, use H2 in-memory or Testcontainers
## Forbidden
- No `@SpringBootTest` in unit tests
- No shared state between tests
- No `Thread.sleep()`: use `Awaitility`
- No real external API calls: use WireMock or Mockito

.claude/settings.json Normal file
@@ -0,0 +1,78 @@
{
  "$schema": "https://json.schemastore.org/claude-code-settings.json",
  "permissions": {
    "allow": [
      "Bash(./mvnw *)",
      "Bash(mvn *)",
      "Bash(java -version)",
      "Bash(git status)",
      "Bash(git diff*)",
      "Bash(git log*)",
      "Bash(git branch*)",
      "Bash(git checkout*)",
      "Bash(git add*)",
      "Bash(git commit*)",
      "Bash(git pull*)",
      "Bash(git fetch*)",
      "Bash(git merge*)",
      "Bash(git stash*)",
      "Bash(git remote*)",
      "Bash(git config*)",
      "Bash(git rev-parse*)",
      "Bash(git show*)",
      "Bash(git tag*)",
      "Bash(curl -s *)",
      "Bash(sdk *)"
    ],
    "deny": [
      "Bash(git push --force*)",
      "Bash(git reset --hard*)",
      "Bash(git clean -fd*)",
      "Bash(git checkout -- .)",
      "Bash(rm -rf /)",
      "Bash(rm -rf ~)",
      "Bash(rm -rf .git*)",
      "Bash(rm -rf /*)",
      "Read(./**/.env*)",
      "Read(./**/secrets/**)",
      "Read(./**/application-local.yml)"
    ]
  },
  "hooks": {
    "SessionStart": [
      {
        "matcher": "compact",
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-post-compact.sh",
            "timeout": 10
          }
        ]
      }
    ],
    "PreCompact": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-pre-compact.sh",
            "timeout": 30
          }
        ]
      }
    ],
    "PostToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-commit.sh",
            "timeout": 15
          }
        ]
      }
    ]
  }
}

@@ -0,0 +1,65 @@
---
name: create-mr
description: Create a Gitea MR (Merge Request) from the current branch
allowed-tools: "Bash, Read, Grep"
argument-hint: "[target-branch: develop|main] (default: develop)"
---
Creates a Gitea MR based on the changes in the current branch.
Target branch: $ARGUMENTS (default: develop)
## Steps
### 1. Pre-checks
- Verify the current branch is not main/develop
- Check for uncommitted changes (warn if any)
- Verify the current branch is pushed to the remote (push it if not)
### 2. Analyze Changes
```bash
git log develop..HEAD --oneline
git diff develop..HEAD --stat
```
- Collect the commit list and the list of changed files
- Write a summary of the main changes
### 3. Compose MR Info
- **Title**: derived from the branch's first commit message or the branch name
  - `feature/ISSUE-42-user-login` → `feat: ISSUE-42 user-login`
- **Body**:
```markdown
## Changes
- (auto-generated from commits)
## Related Issues
- closes #issue-number (extracted from the branch name)
## Testing
- [ ] Build succeeds
- [ ] Existing tests pass
```
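The title derivation can be sketched as follows. Only `feature/ISSUE-42-user-login` → `feat: ISSUE-42 user-login` is documented; the type mapping for other prefixes and the hyphen handling are assumptions:

```java
// Sketch: derive an MR title from a branch name.
// Assumptions beyond the documented example: bugfix/hotfix map to "fix",
// anything else to "chore".
class MrTitle {
    static String fromBranch(String branch) {
        int slash = branch.indexOf('/');
        String type = branch.substring(0, slash);
        String rest = branch.substring(slash + 1);
        String prefix = switch (type) {
            case "feature" -> "feat";
            case "bugfix", "hotfix" -> "fix";
            default -> "chore";
        };
        // Keep the ISSUE-number, turn the hyphen after it into a space.
        String subject = rest.replaceFirst("^(ISSUE-\\d+)-", "$1 ");
        return prefix + ": " + subject;
    }
}
```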
### 4. Create the MR via the Gitea API
```bash
# Extract owner/repo from the Gitea remote URL
REMOTE_URL=$(git remote get-url origin)
# Call the Gitea API
curl -X POST "GITEA_URL/api/v1/repos/{owner}/{repo}/pulls" \
  -H "Authorization: token ${GITEA_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "MR title",
    "body": "MR body",
    "head": "current-branch",
    "base": "target-branch"
  }'
```
### 5. Output
- Print the MR URL
- Prompt for reviewer assignment
- Next steps: await review → approval → merge
## Required Environment Variables
- `GITEA_TOKEN`: Gitea API access token (prompt the user if missing)

@@ -0,0 +1,49 @@
---
name: fix-issue
description: Analyze a Gitea issue and create a fix branch
allowed-tools: "Bash, Read, Write, Edit, Glob, Grep"
argument-hint: "<issue-number>"
---
Analyzes Gitea issue #$ARGUMENTS and starts the fix.
## Steps
### 1. Fetch the Issue
```bash
curl -s "GITEA_URL/api/v1/repos/{owner}/{repo}/issues/$ARGUMENTS" \
  -H "Authorization: token ${GITEA_TOKEN}"
```
- Check the issue title, body, labels, and assignee
- Summarize the issue for the user
### 2. Create a Branch
Choose the branch type from the issue labels:
- `bug` label → `bugfix/ISSUE-number-description`
- otherwise → `feature/ISSUE-number-description`
- urgent → `hotfix/ISSUE-number-description`
```bash
git checkout develop
git pull origin develop
git checkout -b {type}/ISSUE-{number}-{slug}
```
### 3. Analyze the Issue
Based on the issue contents:
- Locate related files (using Grep, Glob)
- Determine the impact scope
- Propose a fix approach
### 4. Present a Fix Plan
Show the user a fix plan and get approval before proceeding:
- Files to modify
- Summary of changes
- Expected impact
### 5. After Completion
- Summarize the changes
- Suggest running `/create-mr`
## Required Environment Variables
- `GITEA_TOKEN`: Gitea API access token

@@ -0,0 +1,246 @@
---
name: init-project
description: Initialize a project with the team's standard workflow
allowed-tools: "Bash, Read, Write, Edit, Glob, Grep"
argument-hint: "[project-type: java-maven|java-gradle|react-ts|auto]"
---
Initializes the project according to the team's standard workflow.
Project type: $ARGUMENTS (default: auto, detect automatically)
## Project Type Auto-Detection
If $ARGUMENTS is "auto" or empty, detect in this order:
1. `pom.xml` exists → **java-maven**
2. `build.gradle` or `build.gradle.kts` exists → **java-gradle**
3. `package.json` + `tsconfig.json` exist → **react-ts**
4. Detection fails → ask the user to choose a type
## Steps
### 1. Analyze the Project
- Identify build files, config files, and directory structure
- Detect frameworks and libraries in use
- Check whether a `.claude/` directory already exists
- Check whether lint tools such as eslint, prettier, checkstyle, spotless are installed
### 2. Generate CLAUDE.md
Create CLAUDE.md at the project root, including:
- Project overview (name, type, main tech stack)
- Build/run commands (based on the detected build tool)
- Test commands
- Lint commands (based on detected tools)
- Summary of the project directory structure
- Team convention references (pointer to `.claude/rules/`)
### Gitea File Download URL Pattern
⚠️ Gitea raw files must use the **web raw URL** (the `/api/v1/` path does not work):
```bash
GITEA_URL="${GITEA_URL:-https://gitea.gc-si.dev}"
# common files: ${GITEA_URL}/gc/template-common/raw/branch/develop/<file-path>
# type-specific files: ${GITEA_URL}/gc/template-<type>/raw/branch/develop/<file-path>
# examples:
curl -sf "${GITEA_URL}/gc/template-common/raw/branch/develop/.claude/rules/team-policy.md"
curl -sf "${GITEA_URL}/gc/template-react-ts/raw/branch/develop/.editorconfig"
```
### 3. Set Up the .claude/ Directory
Skip any team-standard files that already exist. Otherwise download them from Gitea using the URL pattern above:
- `.claude/settings.json`: standard per-type permission settings plus a hooks section (see step 4)
- `.claude/rules/`: team rule files (team-policy, git-workflow, code-style, naming, testing)
- `.claude/skills/`: team skills (create-mr, fix-issue, sync-team-workflow, init-project)
### 4. Generate Hook Scripts
Create the `.claude/scripts/` directory and the following script files (chmod +x):
- `.claude/scripts/on-pre-compact.sh`:
```bash
#!/bin/bash
# PreCompact hook: only systemMessage is supported (hookSpecificOutput unavailable)
INPUT=$(cat)
cat <<RESP
{
  "systemMessage": "Context compaction is about to start. Be sure to:\n\n1. memory/MEMORY.md - refresh the core work state (under 200 lines)\n2. memory/project-snapshot.md - update changed package/type info\n3. memory/project-history.md - append this session's changes\n4. memory/api-types.md - refresh if API interfaces changed\n5. Record any unfinished work in TodoWrite and in memory"
}
RESP
```
- `.claude/scripts/on-post-compact.sh`:
```bash
#!/bin/bash
INPUT=$(cat)
CWD=$(echo "$INPUT" | python3 -c "import sys,json;print(json.load(sys.stdin).get('cwd',''))" 2>/dev/null || echo "")
if [ -z "$CWD" ]; then
    CWD=$(pwd)
fi
PROJECT_HASH=$(echo "$CWD" | sed 's|/|-|g')
MEMORY_DIR="$HOME/.claude/projects/$PROJECT_HASH/memory"
CONTEXT=""
if [ -f "$MEMORY_DIR/MEMORY.md" ]; then
    SUMMARY=$(head -100 "$MEMORY_DIR/MEMORY.md" | python3 -c "import sys;print(sys.stdin.read().replace('\\\\','\\\\\\\\').replace('\"','\\\\\"').replace('\n','\\\\n'))" 2>/dev/null)
    CONTEXT="Context was compacted.\\n\\n[Session summary]\\n${SUMMARY}"
fi
if [ -f "$MEMORY_DIR/project-snapshot.md" ]; then
    SNAP=$(head -50 "$MEMORY_DIR/project-snapshot.md" | python3 -c "import sys;print(sys.stdin.read().replace('\\\\','\\\\\\\\').replace('\"','\\\\\"').replace('\n','\\\\n'))" 2>/dev/null)
    CONTEXT="${CONTEXT}\\n\\n[Latest project state]\\n${SNAP}"
fi
if [ -n "$CONTEXT" ]; then
    CONTEXT="${CONTEXT}\\n\\nContinue from the notes above; see the files under the memory/ directory for details."
    echo "{\"hookSpecificOutput\":{\"additionalContext\":\"${CONTEXT}\"}}"
else
    echo "{\"hookSpecificOutput\":{\"additionalContext\":\"Context was compacted. No memory files exist, so ask the user about the previous work.\"}}"
fi
```
- `.claude/scripts/on-commit.sh`:
```bash
#!/bin/bash
INPUT=$(cat)
COMMAND=$(echo "$INPUT" | python3 -c "import sys,json;print(json.load(sys.stdin).get('tool_input',{}).get('command',''))" 2>/dev/null || echo "")
if echo "$COMMAND" | grep -qE 'git commit'; then
    cat <<RESP
{
  "hookSpecificOutput": {
    "additionalContext": "A commit was detected. Do the following:\n1. Add the change to docs/CHANGELOG.md\n2. Update the changed parts of memory/project-snapshot.md\n3. Append this change to memory/project-history.md\n4. Refresh memory/api-types.md if the API interface changed\n5. If the project has lint configured, check the lint results and fix any issues"
  }
}
RESP
else
    echo '{}'
fi
```
If `.claude/settings.json` has no hooks section, add one (merged into the existing settings.json contents):
```json
{
  "hooks": {
    "SessionStart": [
      {
        "matcher": "compact",
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-post-compact.sh",
            "timeout": 10
          }
        ]
      }
    ],
    "PreCompact": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-pre-compact.sh",
            "timeout": 30
          }
        ]
      }
    ],
    "PostToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          {
            "type": "command",
            "command": "bash .claude/scripts/on-commit.sh",
            "timeout": 15
          }
        ]
      }
    ]
  }
}
```
### 5. Configure Git Hooks
```bash
git config core.hooksPath .githooks
```
Grant execute permission on the `.githooks/` directory contents:
```bash
chmod +x .githooks/*
```
### 6. Type-Specific Extra Setup
#### java-maven
- Create `.sdkmanrc` (java=17.0.18-amzn, or whatever version fits the project)
- Check the `.mvn/settings.xml` Nexus mirror settings
- Verify `mvn compile` succeeds
#### java-gradle
- Create `.sdkmanrc`
- Check the `gradle.properties.example` Nexus settings
- Verify `./gradlew compileJava` succeeds
#### react-ts
- Create `.node-version` (the Node version that fits the project)
- Check the `.npmrc` Nexus registry settings
- Verify `npm install && npm run build` succeeds
### 7. Check .gitignore
Make sure the following entries are present in .gitignore, adding any that are missing:
```
.claude/settings.local.json
.claude/CLAUDE.local.md
.env
.env.*
*.local
```
### 8. Configure Git exclude
Read `.git/info/exclude` and append the following at the bottom, preserving existing content:
```gitignore
# Claude Code workflow (local only)
docs/CHANGELOG.md
*.tmp
```
### 9. Initialize Memory
Locate the project memory directory (usually `~/.claude/projects/<project-hash>/memory/`) and create:
- `memory/MEMORY.md`: core summary based on the project analysis (under 200 lines)
  - current state, project overview, tech stack, main package structure, links to details
- `memory/project-snapshot.md`: directory structure, packages, key dependencies, API endpoints
- `memory/project-history.md`: starts with an "initial team workflow setup" entry
- `memory/api-types.md`: summary of the key interface/DTO/Entity types
- `memory/decisions.md`: empty template (# Decision Log)
- `memory/debugging.md`: empty template (# Debugging Notes & Patterns)
### 10. Check Lint Tools
- TypeScript: check for eslint and prettier; offer to install them if missing
- Java: check checkstyle, spotless, and similar configuration
- Confirm the lint commands are already recorded in CLAUDE.md
### 11. Generate workflow-version.json
Fetch the latest team workflow version via the Gitea API:
```bash
curl -sf --max-time 5 "https://gitea.gc-si.dev/gc/template-common/raw/branch/develop/workflow-version.json"
```
Use the returned `version` value on success; fall back to "1.0.0" on failure.
Create `.claude/workflow-version.json`:
```json
{
  "applied_global_version": "<fetched version>",
  "applied_date": "<today>",
  "project_type": "<detected type>",
  "gitea_url": "https://gitea.gc-si.dev"
}
```
### 12. Verify and Summarize
- List the files created/modified
- Check `git config core.hooksPath`
- Confirm the build command runs
- Confirm the hook scripts are executable
- Next steps:
  - how to start development and make the first commit
  - general-purpose skills: `/api-registry`, `/changelog`, `/swagger-spec`

@@ -0,0 +1,98 @@
---
name: sync-team-workflow
description: Sync the team's global workflow into the current project
allowed-tools: "Bash, Read, Write, Edit, Glob, Grep"
---
Applies the latest version of the team's global workflow to the current project.
## Procedure
### 1. Fetch the Global Version
Fetch workflow-version.json from the template-common repo via the Gitea API:
```bash
GITEA_URL=$(python3 -c "import json; print(json.load(open('.claude/workflow-version.json')).get('gitea_url', 'https://gitea.gc-si.dev'))" 2>/dev/null || echo "https://gitea.gc-si.dev")
curl -sf "${GITEA_URL}/gc/template-common/raw/branch/develop/workflow-version.json"
```
### 2. Compare Versions
Compare against the `applied_global_version` field of the local `.claude/workflow-version.json`:
- Versions match → report "already up to date" and stop
- Versions differ → list the change items not yet applied
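The comparison in step 2 assumes plain `MAJOR.MINOR.PATCH` version strings (no pre-release tags). A sketch of that comparison:

```java
// Numeric per-segment comparison of MAJOR.MINOR.PATCH versions;
// missing segments are treated as 0.
class WorkflowVersion {
    static int compare(String a, String b) {
        String[] x = a.split("\\.");
        String[] y = b.split("\\.");
        for (int i = 0; i < Math.max(x.length, y.length); i++) {
            int xi = i < x.length ? Integer.parseInt(x[i]) : 0;
            int yi = i < y.length ? Integer.parseInt(y[i]) : 0;
            if (xi != yi) return Integer.compare(xi, yi);
        }
        return 0;
    }
}
```

Comparing segment by segment avoids the string-comparison trap where "1.10.0" would sort before "1.2.0".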
### 3. Detect the Project Type
Auto-detection order:
1. Check the `project_type` field of `.claude/workflow-version.json`
2. Otherwise: `pom.xml` → java-maven, `build.gradle` → java-gradle, `package.json` → react-ts
### Gitea File Download URL Pattern
⚠️ Gitea raw files must use the **web raw URL** (the `/api/v1/` path does not work):
```bash
GITEA_URL="${GITEA_URL:-https://gitea.gc-si.dev}"
# common files: ${GITEA_URL}/gc/template-common/raw/branch/develop/<file-path>
# type-specific files: ${GITEA_URL}/gc/template-<type>/raw/branch/develop/<file-path>
# examples:
curl -sf "${GITEA_URL}/gc/template-common/raw/branch/develop/.claude/rules/team-policy.md"
curl -sf "${GITEA_URL}/gc/template-react-ts/raw/branch/develop/.editorconfig"
```
### 4. Download and Apply Files
Download the common + type-specific template files using the URL pattern above:
#### 4-1. Rule Files (overwrite)
Team rules cannot be modified locally; always replace them with the global latest:
```
.claude/rules/team-policy.md
.claude/rules/git-workflow.md
.claude/rules/code-style.md (per type)
.claude/rules/naming.md (per type)
.claude/rules/testing.md (per type)
```
#### 4-2. settings.json (partial update)
- `deny` list: replace with the global latest
- `allow` list: keep existing user customizations, merge in the global defaults
- `hooks`: replace (or add) using the hooks JSON block in the init-project SKILL.md
  - SessionStart(compact) → on-post-compact.sh
  - PreCompact → on-pre-compact.sh
  - PostToolUse(Bash) → on-commit.sh
#### 4-3. Skill Files (overwrite)
```
.claude/skills/create-mr/SKILL.md
.claude/skills/fix-issue/SKILL.md
.claude/skills/sync-team-workflow/SKILL.md
.claude/skills/init-project/SKILL.md
```
#### 4-4. Git Hooks (overwrite + execute permission)
```bash
chmod +x .githooks/*
```
#### 4-5. Refresh Hook Scripts
Extract the latest scripts from the code blocks in the init-project SKILL.md and overwrite:
```
.claude/scripts/on-pre-compact.sh
.claude/scripts/on-post-compact.sh
.claude/scripts/on-commit.sh
```
Grant execute permission: `chmod +x .claude/scripts/*.sh`
### 5. Update the Local Version
Update `.claude/workflow-version.json`:
```json
{
  "applied_global_version": "new version",
  "applied_date": "today's date",
  "project_type": "detected type",
  "gitea_url": "https://gitea.gc-si.dev"
}
```
### 6. Report Changes
- Review the changes with `git diff`
- List the updated files
- Show the change log (the `changes` field of the global workflow-version.json)
- Note any follow-up actions (build check, dependency updates, etc.)

@@ -0,0 +1,6 @@
{
"applied_global_version": "1.2.0",
"applied_date": "2026-02-14",
"project_type": "java-maven",
"gitea_url": "https://gitea.gc-si.dev"
}

.editorconfig Normal file

@ -0,0 +1,33 @@
root = true
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
[*.{java,kt}]
indent_style = space
indent_size = 4
[*.{js,jsx,ts,tsx,json,yml,yaml,css,scss,html}]
indent_style = space
indent_size = 2
[*.md]
trim_trailing_whitespace = false
[*.{sh,bash}]
indent_style = space
indent_size = 4
[Makefile]
indent_style = tab
[*.{gradle,groovy}]
indent_style = space
indent_size = 4
[*.xml]
indent_style = space
indent_size = 4

.githooks/commit-msg Executable file

@ -0,0 +1,60 @@
#!/bin/bash
#==============================================================================
# commit-msg hook
# Validates Conventional Commits format (Korean/English subjects supported)
#==============================================================================
COMMIT_MSG_FILE="$1"
COMMIT_MSG=$(cat "$COMMIT_MSG_FILE")
# Skip validation for merge commits
if echo "$COMMIT_MSG" | head -1 | grep -qE "^Merge "; then
exit 0
fi
# Skip validation for revert commits
if echo "$COMMIT_MSG" | head -1 | grep -qE "^Revert "; then
exit 0
fi
# Conventional Commits regex
# type(scope): subject
# - type: feat|fix|docs|style|refactor|test|chore|ci|perf (required)
# - scope: letters, digits, Hangul, dot, underscore, hyphen (optional)
# - subject: 1-72 characters, Korean/English allowed (required)
PATTERN='^(feat|fix|docs|style|refactor|test|chore|ci|perf)(\([a-zA-Z0-9가-힣._-]+\))?: .{1,72}$'
FIRST_LINE=$(head -1 "$COMMIT_MSG_FILE")
if ! echo "$FIRST_LINE" | grep -qE "$PATTERN"; then
echo ""
echo "╔══════════════════════════════════════════════════════════════╗"
echo "║  Commit message does not follow Conventional Commits format  ║"
echo "╚══════════════════════════════════════════════════════════════╝"
echo ""
echo " Expected format: type(scope): subject"
echo ""
echo " type (required):"
echo " feat — new feature"
echo " fix — bug fix"
echo " docs — documentation changes"
echo " style — code formatting"
echo " refactor — refactoring"
echo " test — tests"
echo " chore — build/config changes"
echo " ci — CI/CD changes"
echo " perf — performance improvements"
echo ""
echo " scope (optional): Korean or English"
echo " subject (required): 1-72 characters, Korean or English"
echo ""
echo " Examples:"
echo " feat(auth): JWT 기반 로그인 구현"
echo " fix(배치): 야간 배치 타임아웃 수정"
echo " docs: README 업데이트"
echo " chore: Gradle 의존성 업데이트"
echo ""
echo " Current message: $FIRST_LINE"
echo ""
exit 1
fi

.githooks/post-checkout Executable file

@ -0,0 +1,25 @@
#!/bin/bash
#==============================================================================
# post-checkout hook
# Automatically configures core.hooksPath on branch checkout
# After clone/checkout, if a .githooks directory exists, set hooksPath to it
#==============================================================================
# post-checkout parameters: prev_HEAD, new_HEAD, branch_flag
# branch_flag=1: branch checkout, 0: file checkout
BRANCH_FLAG="$3"
# Skip file checkouts
if [ "$BRANCH_FLAG" = "0" ]; then
exit 0
fi
# Check that the .githooks directory exists
REPO_ROOT=$(git rev-parse --show-toplevel 2>/dev/null)
if [ -d "${REPO_ROOT}/.githooks" ]; then
CURRENT_HOOKS_PATH=$(git config core.hooksPath 2>/dev/null || echo "")
if [ "$CURRENT_HOOKS_PATH" != ".githooks" ]; then
git config core.hooksPath .githooks
chmod +x "${REPO_ROOT}/.githooks/"* 2>/dev/null
fi
fi

.githooks/pre-commit Executable file

@ -0,0 +1,33 @@
#!/bin/bash
#==============================================================================
# pre-commit hook (Java Maven)
# Verifies the Maven build compiles — blocks the commit on compile failure
#==============================================================================
echo "pre-commit: verifying Maven compilation..."
# Prefer the Maven Wrapper; fall back to mvn
if [ -f "./mvnw" ]; then
MVN="./mvnw"
elif command -v mvn &>/dev/null; then
MVN="mvn"
else
echo "Warning: Maven is not installed. Skipping compile verification."
exit 0
fi
# Compile check (tests skipped; works offline)
$MVN compile -q -DskipTests 2>&1
RESULT=$?
if [ $RESULT -ne 0 ]; then
echo ""
echo "╔══════════════════════════════════════════════════════════╗"
echo "║  Compilation failed! The commit has been blocked.        ║"
echo "║  Fix the compile errors and commit again.                ║"
echo "╚══════════════════════════════════════════════════════════╝"
echo ""
exit 1
fi
echo "pre-commit: compilation succeeded"

.gitignore vendored

@ -93,13 +93,8 @@ application-local.yml
# Logs
logs/
docs/
*.log.*
# Session continuity files (for AI assistants)
.claude/
CLAUDE.md
BASEREADER_ENHANCEMENT_PLAN.md
README.md
nul
# Claude Code (ignore personal files only; team files are tracked)
.claude/settings.local.json
.claude/scripts/

.mvn/settings.xml Normal file

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.2.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.2.0
https://maven.apache.org/xsd/settings-1.2.0.xsd">
<servers>
<server>
<id>nexus</id>
<username>admin</username>
<password>Gcsc!8932</password>
</server>
</servers>
<mirrors>
<mirror>
<id>nexus</id>
<name>GC Nexus Repository</name>
<url>https://nexus.gc-si.dev/repository/maven-public/</url>
<mirrorOf>*</mirrorOf>
</mirror>
</mirrors>
</settings>

.sdkmanrc Normal file

@ -0,0 +1 @@
java=17.0.18-amzn

CLAUDE.md Normal file

@ -0,0 +1,101 @@
# SNP-Batch-1 (snp-batch-validation)
Maritime data integration batch system. Collects vessel/port/incident data from external Maritime APIs into PostgreSQL and serves real-time AIS position data from a cache.
## Tech Stack
- Java 17, Spring Boot 3.2.1, Spring Batch 5.1.0
- PostgreSQL (schema: t_std_snp_data)
- Quartz Scheduler (JDBC store)
- Spring Kafka (AIS Target → Kafka pipeline)
- WebFlux WebClient (external API calls)
- Thymeleaf (batch management web GUI)
- Springdoc OpenAPI 2.3.0 (Swagger)
- Caffeine Cache, JTS (spatial operations)
- Lombok, Jackson
## Build & Run
```bash
# Build
sdk use java 17.0.18-amzn
mvn clean package -DskipTests
# Run
mvn spring-boot:run
# Test
mvn test
```
## Server Configuration
- Port: 8041
- Context path: /snp-api
- Swagger UI: http://localhost:8041/snp-api/swagger-ui/index.html
## Directory Structure
```
src/main/java/com/snp/batch/
├── SnpBatchApplication.java # main application
├── common/ # shared framework
│ ├── batch/ # batch base classes (config, entity, processor, reader, writer)
│ ├── util/ # utilities (JsonChangeDetector, SafeGetDataUtil)
│ └── web/ # web base (ApiResponse, BaseController, BaseService)
├── global/ # global configuration & batch management
│ ├── config/ # AsyncConfig, QuartzConfig, SwaggerConfig, WebClientConfig
│ ├── controller/ # BatchController (/api/batch), WebViewController
│ ├── dto/ # Dashboard, JobExecution, Schedule DTOs
│ ├── model/ # BatchLastExecution, JobScheduleEntity
│ ├── partition/ # partition management (PartitionManagerTasklet)
│ ├── projection/ # DateRangeProjection
│ └── repository/ # BatchApiLog, BatchLastExecution, JobSchedule, Timeline
├── jobs/ # batch Job modules (by domain)
│ ├── aistarget/ # AIS Target (real-time positions + cache + REST API + Kafka publishing)
│ ├── aistargetdbsync/ # AIS Target DB sync (cache → DB)
│ ├── common/ # common codes (FlagCode, Stat5Code)
│ ├── compliance/ # compliance (Compliance, CompanyCompliance)
│ ├── event/ # maritime incidents (Event, EventDetail, Cargo, HumanCasualty)
│ ├── movements/ # vessel movements (many sub-jobs)
│ ├── psc/ # PSC inspections
│ ├── risk/ # risk analysis
│ └── ship*/ # vessel particulars (ship001~ship028, 30+ tables)
└── service/ # BatchService, ScheduleService
```
## Batch Job Pattern
Each Job extends the base classes in `common/batch/`:
- **BaseJobConfig** → Job/Step configuration (chunk-oriented)
- **BaseApiReader** → external Maritime API calls (WebClient)
- **BaseProcessor** → DTO→Entity conversion
- **BaseWriter** → PostgreSQL upsert
- **BaseEntity** → common fields (dataHash, lastModifiedDate, etc.)
## Main API Paths (context-path: /snp-api)
### Batch Management (/api/batch)
| Method | Path | Description |
|--------|------|-------------|
| POST | /jobs/{jobName}/execute | Execute a batch job |
| GET | /jobs | List jobs |
| GET | /jobs/{jobName}/executions | Execution history |
| GET | /executions/{id}/detail | Execution detail (including steps) |
| POST | /executions/{id}/stop | Stop an execution |
| GET/POST | /schedules | Schedule management (CRUD) |
| GET | /dashboard | Dashboard |
| GET | /timeline | Timeline |
### AIS Target (/api/ais-target)
| Method | Path | Description |
|--------|------|-------------|
| GET | /{mmsi} | Latest position by MMSI |
| POST | /batch | Multi-MMSI lookup |
| GET/POST | /search | Time/spatial range search |
| POST | /search/filter | Filtered search (SOG, COG, etc.) |
| POST | /search/polygon | Polygon search |
| POST | /search/wkt | WKT search |
| GET | /search/with-distance | Circular search with distances |
| GET | /{mmsi}/track | Track history |
| GET | /cache/stats | Cache statistics |
| DELETE | /cache | Clear the cache |
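With the service running locally, the endpoints above can be exercised with `curl` (a sketch: the MMSI values and the batch request-body shape are illustrative assumptions, not confirmed by the source):

```shell
BASE="http://localhost:8041/snp-api/api/ais-target"
# Latest position for a single vessel (MMSI is illustrative)
curl -s "${BASE}/205344990"
# Multi-MMSI lookup (body shape is an assumption)
curl -s -X POST "${BASE}/batch" \
  -H "Content-Type: application/json" \
  -d '[205344990, 477995900]'
# Cache statistics
curl -s "${BASE}/cache/stats"
```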
## Lint/Format
- No dedicated lint tooling configured (no Checkstyle or Spotless)
- Use the IDE's default formatter

Diff not shown because the file is too large.

@ -1,517 +0,0 @@
# Swagger API Documentation Guide
**Written**: 2025-10-16
**Version**: 1.0.0
**Project**: SNP Batch - Spring Batch based data integration system
---
## 📋 Completed Swagger Setup
### ✅ Modified Files
1. **BaseController.java** - shared CRUD controller abstract class
- Fixed the Java import alias error (removed `as SwaggerApiResponse`)
- Consolidated responses into the `responses` attribute of `@Operation`
- Used the fully qualified annotation: `@io.swagger.v3.oas.annotations.responses.ApiResponse`
2. **ProductWebController.java** - sample product API controller
- Fixed the Java import alias error
- Fixed the Swagger annotations on custom endpoints
3. **SwaggerConfig.java** - Swagger/OpenAPI 3.0 configuration
- Dynamic server port (`@Value("${server.port:8081}")`)
- Added a detailed API documentation description
- Added a Markdown-formatted description
4. **BatchController.java** - batch management API (already implemented correctly)
---
## 🌐 Swagger UI Access
### URLs
```
Swagger UI:      http://localhost:8081/swagger-ui/index.html
API docs (JSON): http://localhost:8081/v3/api-docs
API docs (YAML): http://localhost:8081/v3/api-docs.yaml
```
### Available API Groups
> **Note**: BaseController is an abstract class, so it does not appear as its own API group.
> All CRUD endpoints are grouped under the `@Tag` of the inheriting controller (e.g. ProductWebController).
#### 1. **Batch Management API** (`/api/batch`)
Batch job execution and schedule management
**Endpoints**:
- `POST /api/batch/jobs/{jobName}/execute` - execute a batch job
- `GET /api/batch/jobs` - list batch jobs
- `GET /api/batch/jobs/{jobName}/executions` - execution history
- `POST /api/batch/executions/{executionId}/stop` - stop an execution
- `GET /api/batch/schedules` - list schedules
- `POST /api/batch/schedules` - create a schedule
- `PUT /api/batch/schedules/{jobName}` - update a schedule
- `DELETE /api/batch/schedules/{jobName}` - delete a schedule
- `PATCH /api/batch/schedules/{jobName}/toggle` - enable/disable a schedule
- `GET /api/batch/dashboard` - dashboard data
- `GET /api/batch/timeline` - timeline data
#### 2. **Product API** (`/api/products`)
Sample product CRUD (extends BaseController)
**All endpoints are displayed together under the "Product API" group.**
**Shared CRUD endpoints** (inherited from BaseController):
- `POST /api/products` - create a product
- `GET /api/products/{id}` - get a product by ID
- `GET /api/products` - get all products
- `GET /api/products/page?offset=0&limit=20` - paged query
- `PUT /api/products/{id}` - update a product
- `DELETE /api/products/{id}` - delete a product
- `GET /api/products/{id}/exists` - existence check
**Custom endpoints**:
- `GET /api/products/by-product-id/{productId}` - look up by product code
- `GET /api/products/stats/active-count` - active product count
---
## 🛠️ Running and Testing the Application
### 1. Build and run
```bash
# Maven build (from IntelliJ IDEA)
mvn clean package -DskipTests
# Run the application
mvn spring-boot:run
```
Or in IntelliJ IDEA:
1. Open `SnpBatchApplication.java`
2. Click the ▶ icon next to the main method
3. Select "Run 'SnpBatchApplication'"
### 2. Open Swagger UI
Open the following URL in a browser:
```
http://localhost:8081/swagger-ui/index.html
```
### 3. API Test Examples
#### Example 1: list batch jobs
```http
GET http://localhost:8081/api/batch/jobs
```
**Expected response**:
```json
[
"sampleProductImportJob",
"shipDataImportJob"
]
```
#### Example 2: execute a batch job
```http
POST http://localhost:8081/api/batch/jobs/sampleProductImportJob/execute
```
**Expected response**:
```json
{
"success": true,
"message": "Job started successfully",
"executionId": 1
}
```
#### Example 3: create a product (sample)
```http
POST http://localhost:8081/api/products
Content-Type: application/json
{
"productId": "TEST-001",
"productName": "테스트 제품",
"category": "Electronics",
"price": 99.99,
"stockQuantity": 50,
"isActive": true,
"rating": 4.5
}
```
**Expected response**:
```json
{
"success": true,
"message": "Product created successfully",
"data": {
"id": 1,
"productId": "TEST-001",
"productName": "테스트 제품",
"category": "Electronics",
"price": 99.99,
"stockQuantity": 50,
"isActive": true,
"rating": 4.5,
"createdAt": "2025-10-16T10:30:00",
"updatedAt": "2025-10-16T10:30:00"
}
}
```
#### Example 4: paged query
```http
GET http://localhost:8081/api/products/page?offset=0&limit=10
```
**Expected response**:
```json
{
"success": true,
"message": "Retrieved 10 items (total: 100)",
"data": [
{ "id": 1, "productName": "Product 1", ... },
{ "id": 2, "productName": "Product 2", ... },
...
]
}
```
---
## 📚 Swagger Annotation Guide
### Pattern used in BaseController
#### ❌ Incorrect usage (not possible in Java)
```java
// Kotlin-style import aliases are not supported in Java
import io.swagger.v3.oas.annotations.responses.ApiResponse as SwaggerApiResponse;
@ApiResponses(value = {
    @SwaggerApiResponse(responseCode = "200", description = "Success")
})
```
#### ✅ Correct usage (as fixed)
```java
// import alias removed
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
@Operation(
    summary = "Create resource",
    description = "Creates a new resource",
    responses = {
        @io.swagger.v3.oas.annotations.responses.ApiResponse(
            responseCode = "200",
            description = "Created"
        ),
        @io.swagger.v3.oas.annotations.responses.ApiResponse(
            responseCode = "500",
            description = "Server error"
        )
    }
)
@PostMapping
public ResponseEntity<ApiResponse<D>> create(
    @Parameter(description = "Resource data to create", required = true)
    @RequestBody D dto) {
    // ...
}
```
### Key Annotations
#### 1. `@Tag` - API grouping
```java
@Tag(name = "Product API", description = "Product management API")
public class ProductWebController extends BaseController<ProductWebDto, Long> {
    // ...
}
```
#### 2. `@Operation` - endpoint documentation
```java
@Operation(
    summary = "Short description (shown in the list)",
    description = "Detailed description (shown when expanded)",
    responses = { /* response definitions */ }
)
```
#### 3. `@Parameter` - parameter description
```java
@Parameter(
    description = "Parameter description",
    required = true,
    example = "example value"
)
@PathVariable String id
```
#### 4. `@io.swagger.v3.oas.annotations.responses.ApiResponse` - response definition
```java
@io.swagger.v3.oas.annotations.responses.ApiResponse(
    responseCode = "200",
    description = "Success message",
    content = @Content(
        mediaType = "application/json",
        schema = @Schema(implementation = ProductDto.class)
    )
)
```
---
## 🎯 Applying Swagger to New Controllers
### 1. Extending BaseController
```java
@RestController
@RequestMapping("/api/myresource")
@RequiredArgsConstructor
@Tag(name = "My Resource API", description = "My resource management API")
public class MyResourceController extends BaseController<MyResourceDto, Long> {
    private final MyResourceService myResourceService;
    @Override
    protected BaseService<?, MyResourceDto, Long> getService() {
        return myResourceService;
    }
    @Override
    protected String getResourceName() {
        return "MyResource";
    }
    // CRUD endpoints generated automatically by BaseController:
    // POST   /api/myresource
    // GET    /api/myresource/{id}
    // GET    /api/myresource
    // GET    /api/myresource/page
    // PUT    /api/myresource/{id}
    // DELETE /api/myresource/{id}
    // GET    /api/myresource/{id}/exists
    // To add a custom endpoint:
    @Operation(
        summary = "Custom lookup",
        description = "Looks up a resource by a specific condition",
        responses = {
            @io.swagger.v3.oas.annotations.responses.ApiResponse(
                responseCode = "200",
                description = "Lookup succeeded"
            )
        }
    )
    @GetMapping("/custom/{key}")
    public ResponseEntity<ApiResponse<MyResourceDto>> customEndpoint(
        @Parameter(description = "Custom key", required = true)
        @PathVariable String key) {
        // implementation...
    }
}
```
### 2. Standalone Controllers
```java
@RestController
@RequestMapping("/api/custom")
@RequiredArgsConstructor
@Slf4j
@Tag(name = "Custom API", description = "Custom API")
public class CustomController {
    @Operation(
        summary = "Custom operation",
        description = "Performs a specific operation",
        responses = {
            @io.swagger.v3.oas.annotations.responses.ApiResponse(
                responseCode = "200",
                description = "Operation succeeded"
            ),
            @io.swagger.v3.oas.annotations.responses.ApiResponse(
                responseCode = "500",
                description = "Server error"
            )
        }
    )
    @PostMapping("/action")
    public ResponseEntity<Map<String, Object>> customAction(
        @Parameter(description = "Action parameters", required = true)
        @RequestBody Map<String, String> params) {
        // implementation...
    }
}
```
---
## 🔍 Swagger UI Layout
### Main screen
```
┌─────────────────────────────────────────────────┐
│ SNP Batch REST API                              │
│ Version: v1.0.0                                 │
│ REST API for the Spring Batch based data        │
│ integration system                              │
├─────────────────────────────────────────────────┤
│ Servers:                                        │
│ ▼ http://localhost:8081 (local dev server)      │
├─────────────────────────────────────────────────┤
│                                                 │
│ ▼ Batch Management API                          │
│   POST /api/batch/jobs/{jobName}/execute        │
│   GET  /api/batch/jobs                          │
│   ...                                           │
│                                                 │
│ ▼ Product API (9 endpoints grouped together)    │
│   POST   /api/products                          │
│   GET    /api/products/{id}                     │
│   GET    /api/products                          │
│   GET    /api/products/page                     │
│   PUT    /api/products/{id}                     │
│   DELETE /api/products/{id}                     │
│   GET    /api/products/{id}/exists              │
│   GET    /api/products/by-product-id/{...}      │
│   GET    /api/products/stats/active-count       │
│                                                 │
│ (no group is shown for the Base API)            │
│                                                 │
└─────────────────────────────────────────────────┘
```
### Endpoint detail view
When an endpoint is expanded:
- **Parameters**: input fields for parameters
- **Request body**: JSON request body editor
- **Try it out**: button to invoke the real API
- **Responses**: response codes and examples
- **Curl**: generated curl command
---
## ⚠️ Troubleshooting
### 1. Swagger UI unreachable
**Symptom**: 404 when opening `http://localhost:8081/swagger-ui/index.html`
**Fix**:
1. Verify the application is running
2. Check the port (`server.port` in `application.yml`)
3. Try these URLs:
- `http://localhost:8081/swagger-ui.html`
- `http://localhost:8081/swagger-ui/`
### 2. 401/403 errors when executing an API
**Symptom**: authentication error when clicking "Try it out"
**Fix**:
- No authentication is currently configured (allowed by default)
- If Spring Security is added, the Swagger paths must be permitted:
```java
.authorizeHttpRequests(auth -> auth
    .requestMatchers("/swagger-ui/**", "/v3/api-docs/**").permitAll()
    .anyRequest().authenticated()
)
```
### 3. An endpoint is not visible
**Symptom**: the controller exists but does not appear in Swagger UI
**Fix**:
1. Check the `@RestController` annotation
2. Check the `@RequestMapping` path
3. Make sure the controller is under the `com.snp.batch` package
4. Restart the application
---
## 📊 Configuration
### application.yml (Swagger-related settings)
```yaml
server:
  port: 8081  # Swagger UI port
# Springdoc OpenAPI settings (add as needed)
springdoc:
  api-docs:
    path: /v3/api-docs  # OpenAPI JSON path
  swagger-ui:
    path: /swagger-ui.html  # Swagger UI path
    enabled: true
    operations-sorter: alpha  # endpoint sorting (alpha, method)
    tags-sorter: alpha  # tag sorting
```
---
## 🎓 Further Reading
### Official Swagger annotation docs
- [OpenAPI 3.0 Annotations](https://github.com/swagger-api/swagger-core/wiki/Swagger-2.X---Annotations)
- [Springdoc OpenAPI](https://springdoc.org/)
### Related file locations
```
src/main/java/com/snp/batch/
├── common/web/controller/BaseController.java # shared CRUD base
├── global/config/SwaggerConfig.java # Swagger configuration
├── global/controller/BatchController.java # Batch API
└── jobs/sample/web/controller/ProductWebController.java # Product API
```
---
## ✅ Checklist
Before running the application:
- [ ] Maven build succeeds
- [ ] `application.yml` settings verified
- [ ] PostgreSQL connection verified
- [ ] Port 8081 available
Swagger verification:
- [ ] Swagger UI reachable
- [ ] Batch Management API visible
- [ ] Product API visible
- [ ] "Try it out" works
- [ ] API responses are correct
---
## 📚 Related Documents
### Core
- **[README.md](README.md)** - project overview and quick-start guide
- **[DEVELOPMENT_GUIDE.md](DEVELOPMENT_GUIDE.md)** - new Job development guide and base-class usage
- **[CLAUDE.md](CLAUDE.md)** - project configuration-management document (session continuity)
### Architecture
- **[docs/architecture/ARCHITECTURE.md](docs/architecture/ARCHITECTURE.md)** - detailed architecture design
- **[docs/architecture/PROJECT_STRUCTURE.md](docs/architecture/PROJECT_STRUCTURE.md)** - Job-centric package structure guide
### Implementation guides
- **[docs/guides/PROXY_SERVICE_GUIDE.md](docs/guides/PROXY_SERVICE_GUIDE.md)** - external API proxy pattern guide
- **[docs/guides/SHIP_API_EXAMPLE.md](docs/guides/SHIP_API_EXAMPLE.md)** - hands-on Maritime API integration example
### Security
- **[docs/security/README.md](docs/security/README.md)** - security strategy overview (planning stage)
---
**Last updated**: 2025-10-16
**Author**: Claude Code
**Version**: 1.1.0


@ -111,6 +111,12 @@
<version>2.3.0</version>
</dependency>
<!-- Kafka -->
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<!-- Caffeine Cache -->
<dependency>
<groupId>com.github.ben-manes.caffeine</groupId>


@ -29,9 +29,23 @@ public abstract class BaseJdbcRepository<T, ID> {
protected final JdbcTemplate jdbcTemplate;
/**
* Returns the table name (implemented by subclasses)
* Returns the target schema name (implemented by subclasses)
* Injected via @Value from app.batch.target-schema.name in application.yml
*/
protected abstract String getTableName();
protected abstract String getTargetSchema();
/**
* Returns the table name only (without the schema; implemented by subclasses)
*/
protected abstract String getSimpleTableName();
/**
* Returns the fully qualified table name (schema.table)
* Subclasses only need to implement getSimpleTableName()
*/
protected String getTableName() {
return getTargetSchema() + "." + getSimpleTableName();
}
/**
* Returns the ID column name (default: "id")


@ -7,7 +7,7 @@ import org.hibernate.annotations.CreationTimestamp;
import java.time.LocalDateTime;
@Entity
@Table(name = "batch_api_log", schema = "t_snp_data")
@Table(name = "batch_api_log", schema = "t_std_snp_data")
@Getter
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@AllArgsConstructor


@ -1,8 +1,8 @@
package com.snp.batch.jobs.aistarget.batch.repository;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -19,104 +19,111 @@ import java.util.Optional;
/**
* AIS Target repository implementation
*
* Table: snp_data.ais_target
* Table: {targetSchema}.ais_target
* PK: mmsi + message_timestamp (composite key)
*/
@Slf4j
@Repository
@RequiredArgsConstructor
public class AisTargetRepositoryImpl implements AisTargetRepository {
private final JdbcTemplate jdbcTemplate;
private final String tableName;
private final String upsertSql;
private static final String TABLE_NAME = "snp_data.ais_target";
public AisTargetRepositoryImpl(JdbcTemplate jdbcTemplate,
@Value("${app.batch.target-schema.name}") String targetSchema) {
this.jdbcTemplate = jdbcTemplate;
this.tableName = targetSchema + ".ais_target";
this.upsertSql = buildUpsertSql(targetSchema);
}
// ==================== UPSERT SQL ====================
private String buildUpsertSql(String schema) {
return """
INSERT INTO %s.ais_target (
mmsi, message_timestamp, imo, name, callsign, vessel_type, extra_info,
lat, lon, geom,
heading, sog, cog, rot,
length, width, draught, length_bow, length_stern, width_port, width_starboard,
destination, eta, status,
age_minutes, position_accuracy, timestamp_utc, repeat_indicator, raim_flag,
radio_status, regional, regional2, spare, spare2,
ais_version, position_fix_type, dte, band_flag,
received_date, collected_at, created_at, updated_at,
tonnes_cargo, in_sts, on_berth, dwt, anomalous,
destination_port_id, destination_tidied, destination_unlocode, imo_verified, last_static_update_received,
lpc_code, message_type, "source", station_id, zone_id
) VALUES (
?, ?, ?, ?, ?, ?, ?,
?, ?, ST_SetSRID(ST_MakePoint(?, ?), 4326),
?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?,
?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?, NOW(), NOW(),
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?
)
ON CONFLICT (mmsi, message_timestamp) DO UPDATE SET
imo = EXCLUDED.imo,
name = EXCLUDED.name,
callsign = EXCLUDED.callsign,
vessel_type = EXCLUDED.vessel_type,
extra_info = EXCLUDED.extra_info,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
geom = EXCLUDED.geom,
heading = EXCLUDED.heading,
sog = EXCLUDED.sog,
cog = EXCLUDED.cog,
rot = EXCLUDED.rot,
length = EXCLUDED.length,
width = EXCLUDED.width,
draught = EXCLUDED.draught,
length_bow = EXCLUDED.length_bow,
length_stern = EXCLUDED.length_stern,
width_port = EXCLUDED.width_port,
width_starboard = EXCLUDED.width_starboard,
destination = EXCLUDED.destination,
eta = EXCLUDED.eta,
status = EXCLUDED.status,
age_minutes = EXCLUDED.age_minutes,
position_accuracy = EXCLUDED.position_accuracy,
timestamp_utc = EXCLUDED.timestamp_utc,
repeat_indicator = EXCLUDED.repeat_indicator,
raim_flag = EXCLUDED.raim_flag,
radio_status = EXCLUDED.radio_status,
regional = EXCLUDED.regional,
regional2 = EXCLUDED.regional2,
spare = EXCLUDED.spare,
spare2 = EXCLUDED.spare2,
ais_version = EXCLUDED.ais_version,
position_fix_type = EXCLUDED.position_fix_type,
dte = EXCLUDED.dte,
band_flag = EXCLUDED.band_flag,
received_date = EXCLUDED.received_date,
collected_at = EXCLUDED.collected_at,
updated_at = NOW(),
tonnes_cargo = EXCLUDED.tonnes_cargo,
in_sts = EXCLUDED.in_sts,
on_berth = EXCLUDED.on_berth,
dwt = EXCLUDED.dwt,
anomalous = EXCLUDED.anomalous,
destination_port_id = EXCLUDED.destination_port_id,
destination_tidied = EXCLUDED.destination_tidied,
destination_unlocode = EXCLUDED.destination_unlocode,
imo_verified = EXCLUDED.imo_verified,
last_static_update_received = EXCLUDED.last_static_update_received,
lpc_code = EXCLUDED.lpc_code,
message_type = EXCLUDED.message_type,
"source" = EXCLUDED."source",
station_id = EXCLUDED.station_id,
zone_id = EXCLUDED.zone_id
""".formatted(schema);
}
private static final String UPSERT_SQL = """
INSERT INTO snp_data.ais_target (
mmsi, message_timestamp, imo, name, callsign, vessel_type, extra_info,
lat, lon, geom,
heading, sog, cog, rot,
length, width, draught, length_bow, length_stern, width_port, width_starboard,
destination, eta, status,
age_minutes, position_accuracy, timestamp_utc, repeat_indicator, raim_flag,
radio_status, regional, regional2, spare, spare2,
ais_version, position_fix_type, dte, band_flag,
received_date, collected_at, created_at, updated_at,
tonnes_cargo, in_sts, on_berth, dwt, anomalous,
destination_port_id, destination_tidied, destination_unlocode, imo_verified, last_static_update_received,
lpc_code, message_type, "source", station_id, zone_id
) VALUES (
?, ?, ?, ?, ?, ?, ?,
?, ?, ST_SetSRID(ST_MakePoint(?, ?), 4326),
?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?,
?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?, NOW(), NOW(),
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?
)
ON CONFLICT (mmsi, message_timestamp) DO UPDATE SET
imo = EXCLUDED.imo,
name = EXCLUDED.name,
callsign = EXCLUDED.callsign,
vessel_type = EXCLUDED.vessel_type,
extra_info = EXCLUDED.extra_info,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
geom = EXCLUDED.geom,
heading = EXCLUDED.heading,
sog = EXCLUDED.sog,
cog = EXCLUDED.cog,
rot = EXCLUDED.rot,
length = EXCLUDED.length,
width = EXCLUDED.width,
draught = EXCLUDED.draught,
length_bow = EXCLUDED.length_bow,
length_stern = EXCLUDED.length_stern,
width_port = EXCLUDED.width_port,
width_starboard = EXCLUDED.width_starboard,
destination = EXCLUDED.destination,
eta = EXCLUDED.eta,
status = EXCLUDED.status,
age_minutes = EXCLUDED.age_minutes,
position_accuracy = EXCLUDED.position_accuracy,
timestamp_utc = EXCLUDED.timestamp_utc,
repeat_indicator = EXCLUDED.repeat_indicator,
raim_flag = EXCLUDED.raim_flag,
radio_status = EXCLUDED.radio_status,
regional = EXCLUDED.regional,
regional2 = EXCLUDED.regional2,
spare = EXCLUDED.spare,
spare2 = EXCLUDED.spare2,
ais_version = EXCLUDED.ais_version,
position_fix_type = EXCLUDED.position_fix_type,
dte = EXCLUDED.dte,
band_flag = EXCLUDED.band_flag,
received_date = EXCLUDED.received_date,
collected_at = EXCLUDED.collected_at,
updated_at = NOW(),
tonnes_cargo = EXCLUDED.tonnes_cargo,
in_sts = EXCLUDED.in_sts,
on_berth = EXCLUDED.on_berth,
dwt = EXCLUDED.dwt,
anomalous = EXCLUDED.anomalous,
destination_port_id = EXCLUDED.destination_port_id,
destination_tidied = EXCLUDED.destination_tidied,
destination_unlocode = EXCLUDED.destination_unlocode,
imo_verified = EXCLUDED.imo_verified,
last_static_update_received = EXCLUDED.last_static_update_received,
lpc_code = EXCLUDED.lpc_code,
message_type = EXCLUDED.message_type,
"source" = EXCLUDED."source",
station_id = EXCLUDED.station_id,
zone_id = EXCLUDED.zone_id
""";
// ==================== RowMapper ====================
@ -181,7 +188,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
@Override
public Optional<AisTargetEntity> findByMmsiAndMessageTimestamp(Long mmsi, OffsetDateTime messageTimestamp) {
String sql = "SELECT * FROM " + TABLE_NAME + " WHERE mmsi = ? AND message_timestamp = ?";
String sql = "SELECT * FROM " + tableName + " WHERE mmsi = ? AND message_timestamp = ?";
List<AisTargetEntity> results = jdbcTemplate.query(sql, rowMapper, mmsi, toTimestamp(messageTimestamp));
return results.isEmpty() ? Optional.empty() : Optional.of(results.get(0));
}
@ -193,7 +200,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
WHERE mmsi = ?
ORDER BY message_timestamp DESC
LIMIT 1
""".formatted(TABLE_NAME);
""".formatted(tableName);
List<AisTargetEntity> results = jdbcTemplate.query(sql, rowMapper, mmsi);
return results.isEmpty() ? Optional.empty() : Optional.of(results.get(0));
}
@ -210,7 +217,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
FROM %s
WHERE mmsi = ANY(?)
ORDER BY mmsi, message_timestamp DESC
""".formatted(TABLE_NAME);
""".formatted(tableName);
Long[] mmsiArray = mmsiList.toArray(new Long[0]);
return jdbcTemplate.query(sql, rowMapper, (Object) mmsiArray);
@ -223,7 +230,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
WHERE mmsi = ?
AND message_timestamp BETWEEN ? AND ?
ORDER BY message_timestamp ASC
""".formatted(TABLE_NAME);
""".formatted(tableName);
return jdbcTemplate.query(sql, rowMapper, mmsi, toTimestamp(start), toTimestamp(end));
}
@ -245,7 +252,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
?
)
ORDER BY mmsi, message_timestamp DESC
""".formatted(TABLE_NAME);
""".formatted(tableName);
return jdbcTemplate.query(sql, rowMapper,
toTimestamp(start), toTimestamp(end),
@ -261,7 +268,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
log.info("Starting AIS Target batch UPSERT: {} rows", entities.size());
jdbcTemplate.batchUpdate(UPSERT_SQL, entities, 1000, (ps, entity) -> {
jdbcTemplate.batchUpdate(upsertSql, entities, 1000, (ps, entity) -> {
int idx = 1;
// PK
ps.setLong(idx++, entity.getMmsi());
@ -336,7 +343,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
@Override
public long count() {
String sql = "SELECT COUNT(*) FROM " + TABLE_NAME;
String sql = "SELECT COUNT(*) FROM " + tableName;
Long count = jdbcTemplate.queryForObject(sql, Long.class);
return count != null ? count : 0L;
}
@ -344,7 +351,7 @@ public class AisTargetRepositoryImpl implements AisTargetRepository {
@Override
@Transactional
public int deleteOlderThan(OffsetDateTime threshold) {
String sql = "DELETE FROM " + TABLE_NAME + " WHERE message_timestamp < ?";
String sql = "DELETE FROM " + tableName + " WHERE message_timestamp < ?";
int deleted = jdbcTemplate.update(sql, toTimestamp(threshold));
log.info("Deleted stale AIS Target rows: {} (cutoff: {})", deleted, threshold);
return deleted;


@ -4,6 +4,7 @@ import com.snp.batch.common.batch.writer.BaseWriter;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import com.snp.batch.jobs.aistarget.cache.AisTargetCacheManager;
import com.snp.batch.jobs.aistarget.classifier.AisClassTypeClassifier;
import com.snp.batch.jobs.aistarget.kafka.AisTargetKafkaProducer;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Component;
@ -15,10 +16,11 @@ import java.util.List;
* Behavior:
* 1. ClassType classification (A/B classification based on the Core20 cache)
* 2. Update the cache with the latest position data (including classType, core20Mmsi)
* 3. Publish AIS Target records to the Kafka topic (split into sub-chunks)
*
* Notes:
* - DB persistence is done by a separate job (aisTargetDbSyncJob) every 15 minutes
* - The writer is only responsible for updating the cache
* - Kafka send failures are logged and processing continues by default
*/
@Slf4j
@Component
@ -26,13 +28,16 @@ public class AisTargetDataWriter extends BaseWriter<AisTargetEntity> {
private final AisTargetCacheManager cacheManager;
private final AisClassTypeClassifier classTypeClassifier;
private final AisTargetKafkaProducer kafkaProducer;
public AisTargetDataWriter(
AisTargetCacheManager cacheManager,
AisClassTypeClassifier classTypeClassifier) {
AisClassTypeClassifier classTypeClassifier,
AisTargetKafkaProducer kafkaProducer) {
super("AisTarget");
this.cacheManager = cacheManager;
this.classTypeClassifier = classTypeClassifier;
this.kafkaProducer = kafkaProducer;
}
@Override
@ -48,5 +53,19 @@ public class AisTargetDataWriter extends BaseWriter<AisTargetEntity> {
log.debug("AIS Target cache update complete: {} entries (cache size: {})",
items.size(), cacheManager.size());
// 3. Kafka publish (only when enabled=true in configuration)
if (!kafkaProducer.isEnabled()) {
log.debug("AIS Kafka publishing disabled - skipping topic send");
return;
}
AisTargetKafkaProducer.PublishSummary summary = kafkaProducer.publish(items);
log.info("AIS Kafka publish complete - topic: {}, requested: {}, succeeded: {}, failed: {}, skipped: {}",
kafkaProducer.getTopic(),
summary.getRequestedCount(),
summary.getSuccessCount(),
summary.getFailedCount(),
summary.getSkippedCount());
}
}

View file

@ -0,0 +1,55 @@
package com.snp.batch.jobs.aistarget.kafka;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
/**
* AIS Target Kafka message schema
*/
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
public class AisTargetKafkaMessage {
/**
* Unique event identifier
* - Format: {mmsi}_{messageTimestamp}
*/
private String eventId;
/**
* Vessel identifier, identical to the Kafka key
*/
private String key;
/**
* Kafka publish time (UTC)
*/
private OffsetDateTime publishedAt;
/**
* All raw/derived AIS data fields
*/
private AisTargetEntity payload;
public static AisTargetKafkaMessage from(AisTargetEntity entity) {
String key = entity.getMmsi() != null ? String.valueOf(entity.getMmsi()) : null;
String messageTs = entity.getMessageTimestamp() != null ? entity.getMessageTimestamp().toString() : "null";
return AisTargetKafkaMessage.builder()
.eventId(key + "_" + messageTs)
.key(key)
.publishedAt(OffsetDateTime.now(ZoneOffset.UTC))
.payload(entity)
.build();
}
}
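The `from()` factory above composes the `eventId` as `{mmsi}_{messageTimestamp}`. A minimal, Spring-free sketch of that composition (the `eventId` helper below is a hypothetical stand-in, not the real entity or message class):

```java
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

// Illustrates the eventId composition used by AisTargetKafkaMessage.from():
// key is the MMSI as a string (or null), and a null timestamp falls back to "null".
public class EventIdSketch {
    static String eventId(Long mmsi, OffsetDateTime messageTimestamp) {
        String key = mmsi != null ? String.valueOf(mmsi) : null;
        String ts = messageTimestamp != null ? messageTimestamp.toString() : "null";
        return key + "_" + ts;
    }

    public static void main(String[] args) {
        OffsetDateTime ts = OffsetDateTime.of(2026, 2, 13, 0, 0, 0, 0, ZoneOffset.UTC);
        System.out.println(eventId(440123456L, ts)); // MMSI + "_" + ISO-8601 timestamp
    }
}
```

Note that in the producer, entities with a null MMSI or timestamp are filtered out by `isValid()` before `from()` is called, so the `"null"` fallback is a defensive default rather than an expected path.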

View file

@ -0,0 +1,207 @@
package com.snp.batch.jobs.aistarget.kafka;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.jobs.aistarget.batch.entity.AisTargetEntity;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicInteger;
/**
* AIS Target Kafka Producer
*
* Policy:
* - key: MMSI
* - value: AisTargetKafkaMessage (JSON)
* - On failure, log and continue by default (failOnSendError=false)
*/
@Slf4j
@Component
@RequiredArgsConstructor
public class AisTargetKafkaProducer {
private final KafkaTemplate<String, String> kafkaTemplate;
private final ObjectMapper objectMapper;
private final AisTargetKafkaProperties kafkaProperties;
public boolean isEnabled() {
return kafkaProperties.isEnabled();
}
public String getTopic() {
return kafkaProperties.getTopic();
}
/**
* Splits a collected chunk into sub-chunks and publishes them to Kafka
*/
public PublishSummary publish(List<AisTargetEntity> entities) {
if (!isEnabled()) {
return PublishSummary.disabled();
}
if (entities == null || entities.isEmpty()) {
return PublishSummary.empty();
}
int subChunkSize = Math.max(1, kafkaProperties.getSendChunkSize());
PublishSummary totalSummary = PublishSummary.empty();
for (int from = 0; from < entities.size(); from += subChunkSize) {
int to = Math.min(from + subChunkSize, entities.size());
List<AisTargetEntity> subChunk = entities.subList(from, to);
PublishSummary chunkSummary = publishSubChunk(subChunk);
totalSummary.merge(chunkSummary);
log.info("AIS Kafka sub-chunk publish complete - topic: {}, range: {}~{}, requested: {}, succeeded: {}, failed: {}, skipped: {}",
getTopic(), from, to - 1,
chunkSummary.getRequestedCount(),
chunkSummary.getSuccessCount(),
chunkSummary.getFailedCount(),
chunkSummary.getSkippedCount());
}
if (kafkaProperties.isFailOnSendError() && totalSummary.getFailedCount() > 0) {
throw new IllegalStateException("AIS Kafka send failure count: " + totalSummary.getFailedCount());
}
return totalSummary;
}
private PublishSummary publishSubChunk(List<AisTargetEntity> subChunk) {
AtomicInteger successCount = new AtomicInteger(0);
AtomicInteger failedCount = new AtomicInteger(0);
AtomicInteger skippedCount = new AtomicInteger(0);
AtomicInteger sampledErrorLogs = new AtomicInteger(0);
List<CompletableFuture<Void>> futures = new ArrayList<>(subChunk.size());
for (AisTargetEntity entity : subChunk) {
if (!isValid(entity)) {
skippedCount.incrementAndGet();
continue;
}
try {
String key = String.valueOf(entity.getMmsi());
String payload = objectMapper.writeValueAsString(AisTargetKafkaMessage.from(entity));
CompletableFuture<Void> trackedFuture = kafkaTemplate.send(getTopic(), key, payload)
.handle((result, ex) -> {
if (ex != null) {
failedCount.incrementAndGet();
logSendError(sampledErrorLogs,
"AIS Kafka send failed - topic: " + getTopic()
+ ", key: " + key
+ ", messageTimestamp: " + entity.getMessageTimestamp()
+ ", error: " + ex.getMessage());
} else {
successCount.incrementAndGet();
}
return null;
});
futures.add(trackedFuture);
} catch (JsonProcessingException e) {
failedCount.incrementAndGet();
logSendError(sampledErrorLogs,
"AIS Kafka message serialization failed - mmsi: " + entity.getMmsi()
+ ", messageTimestamp: " + entity.getMessageTimestamp()
+ ", error: " + e.getMessage());
} catch (Exception e) {
failedCount.incrementAndGet();
logSendError(sampledErrorLogs,
"AIS Kafka send request failed - mmsi: " + entity.getMmsi()
+ ", messageTimestamp: " + entity.getMessageTimestamp()
+ ", error: " + e.getMessage());
}
}
if (!futures.isEmpty()) {
CompletableFuture.allOf(futures.toArray(new CompletableFuture[0])).join();
kafkaTemplate.flush();
}
return PublishSummary.of(
false,
subChunk.size(),
successCount.get(),
failedCount.get(),
skippedCount.get()
);
}
private boolean isValid(AisTargetEntity entity) {
return entity != null
&& entity.getMmsi() != null
&& entity.getMessageTimestamp() != null;
}
private void logSendError(AtomicInteger sampledErrorLogs, String message) {
int current = sampledErrorLogs.incrementAndGet();
if (current <= 5) {
log.error(message);
return;
}
if (current == 6) {
log.error("Too many AIS Kafka send errors; suppressing further detail logs.");
}
}
@Getter
public static class PublishSummary {
private final boolean disabled;
private int requestedCount;
private int successCount;
private int failedCount;
private int skippedCount;
private PublishSummary(
boolean disabled,
int requestedCount,
int successCount,
int failedCount,
int skippedCount
) {
this.disabled = disabled;
this.requestedCount = requestedCount;
this.successCount = successCount;
this.failedCount = failedCount;
this.skippedCount = skippedCount;
}
public static PublishSummary disabled() {
return of(true, 0, 0, 0, 0);
}
public static PublishSummary empty() {
return of(false, 0, 0, 0, 0);
}
public static PublishSummary of(
boolean disabled,
int requestedCount,
int successCount,
int failedCount,
int skippedCount
) {
return new PublishSummary(disabled, requestedCount, successCount, failedCount, skippedCount);
}
public void merge(PublishSummary other) {
this.requestedCount += other.requestedCount;
this.successCount += other.successCount;
this.failedCount += other.failedCount;
this.skippedCount += other.skippedCount;
}
}
}
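The `publish()` loop above partitions the collected chunk into half-open `[from, to)` sub-ranges of `sendChunkSize`, which is also why the per-sub-chunk log prints `from` through `to - 1`. A minimal sketch of that partitioning (the `ranges` helper is illustrative, not part of the producer):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the sub-chunk split in AisTargetKafkaProducer.publish():
// a chunk of `total` items is sent in batches of at most `sendChunkSize`.
public class SubChunkSketch {
    static List<int[]> ranges(int total, int sendChunkSize) {
        int size = Math.max(1, sendChunkSize); // same guard as the producer
        List<int[]> out = new ArrayList<>();
        for (int from = 0; from < total; from += size) {
            int to = Math.min(from + size, total);
            out.add(new int[]{from, to}); // half-open range [from, to)
        }
        return out;
    }

    public static void main(String[] args) {
        // 12 items with sendChunkSize=5 -> [0,5), [5,10), [10,12)
        for (int[] r : ranges(12, 5)) {
            System.out.println(r[0] + ".." + (r[1] - 1)); // inclusive range as logged
        }
    }
}
```

Each sub-chunk is sent asynchronously, then `allOf(...).join()` plus `flush()` makes the batch's outcome known before its summary is merged, so the counters are accurate per sub-chunk.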

View file

@ -0,0 +1,36 @@
package com.snp.batch.jobs.aistarget.kafka;
import lombok.Getter;
import lombok.Setter;
import org.springframework.boot.context.properties.ConfigurationProperties;
/**
* AIS Target Kafka publishing settings
*/
@Getter
@Setter
@ConfigurationProperties(prefix = "app.batch.ais-target.kafka")
public class AisTargetKafkaProperties {
/**
* Whether Kafka publishing is enabled
*/
private boolean enabled = true;
/**
* Target topic for publishing
*/
private String topic = "tp_SNP_AIS_Signal";
/**
* Sub-chunk size for Kafka publishing.
* Splits the send batch independently of the collection chunk size (e.g. 50,000).
*/
private int sendChunkSize = 5000;
/**
* Whether a send failure should fail the Step.
* When false, failures are only logged and processing continues.
*/
private boolean failOnSendError = false;
}
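A matching `application.yml` fragment would look like the following. The values are illustrative defaults; per the commit message above, the profiles actually use the renamed topic `tp_Global_AIS_Signal`. Note also that a `@ConfigurationProperties` class still needs to be registered (e.g. via `@EnableConfigurationProperties` or `@ConfigurationPropertiesScan`) for binding to take effect:

```yaml
app:
  batch:
    ais-target:
      kafka:
        enabled: true
        topic: tp_Global_AIS_Signal   # renamed from tp_SNP_AIS_Signal
        send-chunk-size: 5000
        fail-on-send-error: false
```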

View file

@ -3,6 +3,7 @@ package com.snp.batch.jobs.common.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.common.batch.entity.FlagCodeEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -15,18 +16,29 @@ import java.util.List;
@Repository("FlagCodeRepository")
public class FlagCodeRepositoryImpl extends BaseJdbcRepository<FlagCodeEntity, String> implements FlagCodeRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.code-002}")
private String tableName;
public FlagCodeRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getEntityName() {
return "FlagCodeEntity";
}
@Override
protected String getTableName() {
return "t_snp_data.flagcode";
protected String getSimpleTableName() {
return tableName;
}
@ -39,8 +51,8 @@ public class FlagCodeRepositoryImpl extends BaseJdbcRepository<FlagCodeEntity, S
protected String getUpdateSql() {
return """
INSERT INTO %s(
datasetversion, code, decode, iso2, iso3,
job_execution_id, created_by
dataset_ver, ship_country_cd, cd_nm, iso_two_cd, iso_thr_cd,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@ -3,6 +3,7 @@ package com.snp.batch.jobs.common.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.common.batch.entity.Stat5CodeEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -14,18 +15,30 @@ import java.util.List;
@Slf4j
@Repository("Stat5CodeRepository")
public class Stat5CodeRepositoryImpl extends BaseJdbcRepository<Stat5CodeEntity, String> implements Stat5CodeRepository{
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.code-001}")
private String tableName;
public Stat5CodeRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getEntityName() {
return "Stat5CodeEntity";
}
@Override
protected String getTableName() {
return "t_snp_data.stat5code";
protected String getSimpleTableName() {
return tableName;
}
@Override
@ -47,8 +60,8 @@ public class Stat5CodeRepositoryImpl extends BaseJdbcRepository<Stat5CodeEntity,
protected String getUpdateSql() {
return """
INSERT INTO %s(
level1, level1decode, level2, level2decode, level3, level3decode, level4, level4decode, level5, level5decode, description, release,
job_execution_id, created_by
lv_one, lv_one_desc, lv_two, lv_two_desc, lv_thr, lv_thr_desc, lv_four, lv_four_desc, lv_five, lv_five_desc, dtl_desc, rls_iem,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
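These repositories build their INSERT statements with Java text blocks plus `String.formatted(...)`, substituting the schema-qualified table name into the `%s` placeholder. A minimal sketch of the pattern (the class and column names below are trimmed for illustration):

```java
// Sketch of the text-block + formatted(...) pattern used by getUpdateSql():
// the schema-qualified table name is injected once, while row values stay
// as JDBC '?' placeholders for safe parameter binding.
public class FormattedSqlSketch {
    static String updateSql(String qualifiedTable) {
        return """
                INSERT INTO %s(
                    lv_one, job_execution_id, creatr_id
                ) VALUES (?, ?, ?);
                """.formatted(qualifiedTable);
    }

    public static void main(String[] args) {
        System.out.println(updateSql("t_snp_data.stat5code"));
    }
}
```

Only the table identifier is interpolated; since it comes from trusted configuration (`@Value` on `app.batch.target-schema.*`) rather than user input, this keeps the statement safe while still making the schema configurable.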

View file

@ -47,9 +47,12 @@ public class CompanyComplianceImportRangeJobConfig extends BaseMultiStepJobConfi
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "COMPANY_COMPLIANCE_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override
protected int getChunkSize() {
@ -157,7 +160,8 @@ public class CompanyComplianceImportRangeJobConfig extends BaseMultiStepJobConfi
log.info("Company Compliance History Value Change Manage procedure parameters (KST conversion): start: {}, end: {}", startDt, endDt);
// 3. Call the procedure (safe parameter binding recommended)
jdbcTemplate.update("CALL new_snp.company_compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", startDt, endDt);
String procedureCall = String.format("CALL %s.company_compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", targetSchema);
jdbcTemplate.update(procedureCall, startDt, endDt);
log.info(">>>>> Company Compliance History Value Change Manage procedure call complete");
return RepeatStatus.FINISHED;

View file

@ -14,6 +14,7 @@ import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
@ -30,6 +31,9 @@ public class ComplianceImportJobConfig extends BaseJobConfig<ComplianceDto, Comp
private final ComplianceDataWriter complianceDataWriter;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Override
protected int getChunkSize() {
return 5000; // the API returns 5000 records at a time, so the chunk size matches
@ -60,7 +64,7 @@ public class ComplianceImportJobConfig extends BaseJobConfig<ComplianceDto, Comp
@Override
protected ItemReader<ComplianceDto> createReader() {
return new ComplianceDataReader(maritimeServiceApiWebClient, jdbcTemplate);
return new ComplianceDataReader(maritimeServiceApiWebClient, jdbcTemplate, targetSchema);
}
@Override

View file

@ -46,9 +46,12 @@ public class ComplianceImportRangeJobConfig extends BaseMultiStepJobConfig<Compl
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "COMPLIANCE_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override
protected int getChunkSize() {
@ -159,7 +162,8 @@ public class ComplianceImportRangeJobConfig extends BaseMultiStepJobConfig<Compl
log.info("Compliance History Value Change Manage procedure parameters (KST conversion): start: {}, end: {}", startDt, endDt);
// 3. Call the procedure (safe parameter binding recommended)
jdbcTemplate.update("CALL new_snp.compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", startDt, endDt);
String procedureCall = String.format("CALL %s.compliance_history_value_change_manage(CAST(? AS TIMESTAMP), CAST(? AS TIMESTAMP))", targetSchema);
jdbcTemplate.update(procedureCall, startDt, endDt);
log.info(">>>>> Compliance History Value Change Manage procedure call complete");
return RepeatStatus.FINISHED;

View file

@ -19,14 +19,16 @@ public class ComplianceDataReader extends BaseApiReader<ComplianceDto> {
// 3. Response data -> update into Core20 (repeated per chunk)
private final JdbcTemplate jdbcTemplate;
private final String targetSchema;
private List<String> allImoNumbers;
private int currentBatchIndex = 0;
private final int batchSize = 100;
public ComplianceDataReader(WebClient webClient, JdbcTemplate jdbcTemplate) {
public ComplianceDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, String targetSchema) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
this.targetSchema = targetSchema;
enableChunkMode(); // enable chunk mode
}
@ -47,16 +49,17 @@ public class ComplianceDataReader extends BaseApiReader<ComplianceDto> {
}
private String getTargetTable(){
return "snp_data.core20";
return targetSchema + ".ship_data";
}
private String getImoQuery() {
return "select imo_number as ihslrorimoshipno from " + getTargetTable() + " order by imo_number";
}
private String GET_CORE_IMO_LIST =
// "SELECT ihslrorimoshipno FROM " + getTargetTable() + " ORDER BY ihslrorimoshipno";
"select imo_number as ihslrorimoshipno from snp_data.ship_data order by imo_number";
@Override
protected void beforeFetch(){
log.info("[{}] Starting IMO number lookup from the Core20 table...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_CORE_IMO_LIST, String.class);
allImoNumbers = jdbcTemplate.queryForList(getImoQuery(), String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
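The reader pages the IMO list in batches of `batchSize` (100 here), computing the total batch count with a `Math.ceil` over the list size. A minimal sketch of that calculation (the helper name is illustrative):

```java
// Sketch of the batch count computed in ComplianceDataReader.beforeFetch():
// totalBatches = ceil(totalImoNumbers / batchSize).
public class BatchCountSketch {
    static int totalBatches(int totalImoNumbers, int batchSize) {
        return (int) Math.ceil((double) totalImoNumbers / batchSize);
    }

    public static void main(String[] args) {
        System.out.println(totalBatches(250, 100)); // 3 batches: 100 + 100 + 50
    }
}
```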

View file

@ -3,6 +3,7 @@ package com.snp.batch.jobs.compliance.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.compliance.batch.entity.CompanyComplianceEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -14,13 +15,25 @@ import java.util.List;
@Slf4j
@Repository("CompanyComplianceRepository")
public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyComplianceEntity, Long> implements CompanyComplianceRepository{
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.risk-compliance-003}")
private String tableName;
public CompanyComplianceRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "t_snp_data.tb_company_compliance_info";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@ -42,11 +55,11 @@ public class CompanyComplianceRepositoryImpl extends BaseJdbcRepository<CompanyC
protected String getUpdateSql() {
return """
INSERT INTO %s(
owcode, lastupdated,
companyoverallcompliancestatus, companyonaustraliansanctionlist, companyonbessanctionlist, companyoncanadiansanctionlist, companyinofacsanctionedcountry,
companyinfatfjurisdiction, companyoneusanctionlist, companyonofacsanctionlist, companyonofacnonsdnsanctionlist, companyonofacssilist,
companyonswisssanctionlist, companyonuaesanctionlist, companyonunsanctionlist, parentcompanycompliancerisk,
job_execution_id, created_by
company_cd, lst_mdfcn_dt,
company_snths_compliance_status, company_aus_sanction_list, company_bes_sanction_list, company_can_sanction_list, company_ofac_sanction_country,
company_fatf_cmptnc_country, company_eu_sanction_list, company_ofac_sanction_list, company_ofac_non_sdn_sanction_list, company_ofacssi_sanction_list,
company_swiss_sanction_list, company_uae_sanction_list, company_un_sanction_list, prnt_company_compliance_risk,
job_execution_id, creatr_id
)VALUES(
?, ?::timestamp, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?

View file

@ -3,6 +3,7 @@ package com.snp.batch.jobs.compliance.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.compliance.batch.entity.ComplianceEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -15,13 +16,24 @@ import java.util.List;
@Repository("ComplianceRepository")
public class ComplianceRepositoryImpl extends BaseJdbcRepository<ComplianceEntity, Long> implements ComplianceRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.risk-compliance-002}")
private String tableName;
public ComplianceRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "t_snp_data.compliance";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@ -43,18 +55,18 @@ public class ComplianceRepositoryImpl extends BaseJdbcRepository<ComplianceEntit
protected String getUpdateSql() {
return """
INSERT INTO %s (
lrimoshipno, dateamended, legaloverall, shipbessanctionlist, shipdarkactivityindicator,
shipdetailsnolongermaintained, shipeusanctionlist, shipflagdisputed, shipflagsanctionedcountry,
shiphistoricalflagsanctionedcountry, shipofacnonsdnsanctionlist, shipofacsanctionlist,
shipofacadvisorylist, shipownerofacssilist, shipowneraustraliansanctionlist, shipownerbessanctionlist,
shipownercanadiansanctionlist, shipownereusanctionlist, shipownerfatfjurisdiction,
shipownerhistoricalofacsanctionedcountry, shipownerofacsanctionlist, shipownerofacsanctionedcountry,
shipownerparentcompanynoncompliance, shipownerparentfatfjurisdiction, shipownerparentofacsanctionedcountry,
shipownerswisssanctionlist, shipowneruaesanctionlist, shipownerunsanctionlist,
shipsanctionedcountryportcalllast12m, shipsanctionedcountryportcalllast3m, shipsanctionedcountryportcalllast6m,
shipsecuritylegaldisputeevent, shipstspartnernoncompliancelast12m, shipswisssanctionlist,
shipunsanctionlist,
job_execution_id, created_by
imo_no, last_mdfcn_dt, lgl_snths_sanction, ship_bes_sanction_list, ship_dark_actv_ind,
ship_dtld_info_ntmntd, ship_eu_sanction_list, ship_flg_dspt, ship_flg_sanction_country,
ship_flg_sanction_country_hstry, ship_ofac_non_sdn_sanction_list, ship_ofac_sanction_list,
ship_ofac_cutn_list, ship_ownr_ofcs_sanction_list, ship_ownr_aus_sanction_list, ship_ownr_bes_sanction_list,
ship_ownr_can_sanction_list, ship_ownr_eu_sanction_list, ship_ownr_fatf_rgl_zone,
ship_ownr_ofac_sanction_hstry, ship_ownr_ofac_sanction_list, ship_ownr_ofac_sanction_country,
ship_ownr_prnt_company_ncmplnc, ship_ownr_prnt_company_fatf_rgl_zone, ship_ownr_prnt_company_ofac_sanction_country,
ship_ownr_swi_sanction_list, ship_ownr_uae_sanction_list, ship_ownr_un_sanction_list,
ship_sanction_country_prtcll_last_twelve_m, ship_sanction_country_prtcll_last_thr_m, ship_sanction_country_prtcll_last_six_m,
ship_scrty_lgl_dspt_event, ship_sts_prtnr_non_compliance_twelve_m, ship_swi_sanction_list,
ship_un_sanction_list,
job_execution_id, creatr_id
)
VALUES (
?, ?::timestamptz, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,

View file

@ -42,9 +42,12 @@ public class EventImportJobConfig extends BaseMultiStepJobConfig<EventDetailDto,
@Value("${app.batch.ship-api.url}")
private String maritimeApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "EVENT_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override
protected int getChunkSize() {

View file

@ -5,9 +5,8 @@ import com.snp.batch.jobs.event.batch.entity.CargoEntity;
import com.snp.batch.jobs.event.batch.entity.EventDetailEntity;
import com.snp.batch.jobs.event.batch.entity.HumanCasualtyEntity;
import com.snp.batch.jobs.event.batch.entity.RelationshipEntity;
import com.snp.batch.jobs.shipdetail.batch.entity.GroupBeneficialOwnerHistoryEntity;
import com.snp.batch.jobs.shipdetail.batch.repository.ShipDetailSql;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -20,13 +19,24 @@ import java.util.List;
@Repository("EventRepository")
public class EventRepositoryImpl extends BaseJdbcRepository<EventDetailEntity, Long> implements EventRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.event-001}")
private String tableName;
public EventRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return null;
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override

View file

@ -1,21 +1,65 @@
package com.snp.batch.jobs.event.batch.repository;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
/**
* SQL builder class for Event-related tables.
* Uses the app.batch.target-schema.name value from application.yml.
*/
@Component
public class EventSql {
private static String targetSchema;
private static String eventTable;
private static String eventCargoTable;
private static String eventRelationshipTable;
private static String eventHumanCasualtyTable;
@Value("${app.batch.target-schema.name}")
public void setTargetSchema(String schema) {
EventSql.targetSchema = schema;
}
@Value("${app.batch.target-schema.tables.event-001}")
public void setEventTable(String table) {
EventSql.eventTable = table;
}
@Value("${app.batch.target-schema.tables.event-002}")
public void setEventCargoTable(String table) {
EventSql.eventCargoTable = table;
}
@Value("${app.batch.target-schema.tables.event-004}")
public void setEventRelationshipTable(String table) {
EventSql.eventRelationshipTable = table;
}
@Value("${app.batch.target-schema.tables.event-003}")
public void setEventHumanCasualtyTable(String table) {
EventSql.eventHumanCasualtyTable = table;
}
public static String getTargetSchema() {
return targetSchema;
}
public static String getEventDetailUpdateSql(){
return """
INSERT INTO t_snp_data.event (
event_id, incident_id, ihslrorimoshipno, published_date, event_start_date, event_end_date,
attempted_boarding, cargo_loading_status_code, casualty_action,
casualty_zone, casualty_zone_code, component2, country_code,
date_of_build, description, environment_location, location_name,
marsden_grid_reference, town_name, event_type, event_type_detail,
event_type_detail_id, event_type_id, fired_upon, headline,
ldt_at_time, significance, weather, pollutant, pollutant_quantity,
pollutant_unit, registered_owner_code_at_time, registered_owner_at_time,
registered_owner_country_code_at_time, registered_owner_country_at_time,
vessel_dwt, vessel_flag_code, vessel_flag_decode, vessel_gt,
vessel_name, vessel_type, vessel_type_decode,
job_execution_id, created_by
INSERT INTO %s.%s (
event_id, acdnt_id, imo_no, pstg_ymd, event_start_day, event_end_day,
embrk_try_yn, cargo_capacity_status_cd, acdnt_actn,
acdnt_zone, acdnt_zone_cd, cfg_cmpnt_two, country_cd,
build_ymd, event_expln, env_position, position_nm,
masd_grid_ref, cty_nm, event_type, event_type_dtl,
event_type_dtl_id, event_type_id, firedupon_yn, sj,
ldt_timpt, signfct, wethr, pltn_matral, pltn_matral_cnt,
pltn_matral_unit, reg_shponr_cd_hr, reg_shponr_hr,
reg_shponr_country_cd_hr, reg_shponr_country_hr,
ship_dwt, ship_flg_cd, ship_flg_decd, ship_gt,
ship_nm, ship_type, ship_type_nm,
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?::timestamptz,?::timestamptz,?::timestamptz, ?, ?, ?, ?, ?, ?,
@ -24,49 +68,49 @@ public class EventSql {
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?
);
""";
""".formatted(targetSchema, eventTable);
}
public static String getEventCargoSql(){
return """
INSERT INTO t_snp_data.event_cargo (
event_id, "sequence", ihslrorimoshipno, "type", quantity,
unit_short, unit, cargo_damage, dangerous, "text",
job_execution_id, created_by
INSERT INTO %s.%s (
event_id, event_seq, imo_no, "type", cnt,
unit_abbr, unit, cargo_damg, risk_yn, "text",
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);
""";
""".formatted(targetSchema, eventCargoTable);
}
public static String getEventRelationshipSql(){
return """
INSERT INTO t_snp_data.event_relationship (
incident_id, event_id, relationship_type, relationship_type_code,
event_id_2, event_type, event_type_code,
job_execution_id, created_by
INSERT INTO %s.%s (
acdnt_id, event_id, rel_type, rel_type_cd,
event_id_two, event_type, event_type_cd,
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?,
?, ?, ?,
?, ?
);
""";
""".formatted(targetSchema, eventRelationshipTable);
}
public static String getEventHumanCasualtySql(){
return """
INSERT INTO t_snp_data.event_humancasualty (
event_id, "scope", "type", qualifier, "count",
job_execution_id, created_by
INSERT INTO %s.%s (
event_id, "scope", "type", qualfr, cnt,
job_execution_id, creatr_id
)
VALUES (
?, ?, ?, ?, ?,
?, ?
);
""";
""".formatted(targetSchema, eventHumanCasualtyTable);
}
}
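`EventSql` uses an unusual pattern: `@Value` setters on a `@Component` instance populate `static` fields that the static SQL builders read. This works only after Spring has instantiated the bean, so the static methods must not be called before the context is up. A Spring-free sketch of the mechanism (class, schema, and table names below are illustrative):

```java
// Spring-free sketch of EventSql's pattern: instance setters populate static
// fields, and static SQL builders read them. In the real class the setters are
// invoked by Spring's @Value injection when the @Component bean is created.
public class StaticConfigSketch {
    private static String targetSchema;
    private static String eventTable;

    public void setTargetSchema(String schema) { StaticConfigSketch.targetSchema = schema; }
    public void setEventTable(String table) { StaticConfigSketch.eventTable = table; }

    public static String insertSql() {
        // schema and table are substituted; values stay as '?' placeholders
        return "INSERT INTO %s.%s (event_id) VALUES (?)".formatted(targetSchema, eventTable);
    }

    public static void main(String[] args) {
        StaticConfigSketch bean = new StaticConfigSketch(); // Spring would do this
        bean.setTargetSchema("t_snp_data");
        bean.setEventTable("event");
        System.out.println(insertSql());
    }
}
```

If the static methods could be reached before bean creation, the fields would still be null; an instance-based SQL builder injected where needed would avoid that ordering hazard, at the cost of touching every call site.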

View file

@ -3,6 +3,7 @@ package com.snp.batch.jobs.facility.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.facility.batch.entity.PortEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@ -15,13 +16,24 @@ import java.util.List;
@Repository("FacilityRepository")
public class FacilityRepositoryImpl extends BaseJdbcRepository<PortEntity, Long> implements FacilityRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.facility-001}")
private String tableName;
public FacilityRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "t_snp_data.facility_port";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@ -43,13 +55,13 @@ public class FacilityRepositoryImpl extends BaseJdbcRepository<PortEntity, Long>
protected String getUpdateSql() {
return """
INSERT INTO %s(
port_ID, old_ID, status, port_Name, unlocode, countryCode, country_Name, region_Name, continent_Name, master_POID,
dec_Lat, dec_Long, position_lat, position_long, position_z, position_m, position_hasZ, position_hasM, position_isNull, position_stSrid, time_Zone, dayLight_Saving_Time,
maximum_Draft, max_LOA, max_Beam, max_DWT, max_Offshore_Draught, max_Offshore_LOA, max_Offshore_BCM, max_Offshore_DWT,
breakbulk_Facilities, container_Facilities, dry_Bulk_Facilities, liquid_Facilities, roRo_Facilities, passenger_Facilities, dry_Dock_Facilities,
lpG_Facilities, lnG_Facilities, lnG_Bunker, dO_Bunker, fO_Bunker, ispS_Compliant, csI_Compliant, free_Trade_Zone, ecO_Port, emission_Control_Area, wS_Port,
last_Update, entry_Date,
job_execution_id, created_by
port_id, bfr_id, status, port_nm, un_port_cd, country_cd, country_nm, areanm, cntntnm, mst_port_id,
lat_decml, lon_decml, position_lat, position_lon, position_z_val, position_mval_val, z_val_has_yn, mval_val_has_yn, position_nul_yn, position_sts_id, hr_zone, daylgt_save_hr,
max_draft, max_whlnth, max_beam, max_dwt, max_sea_draft, max_sea_whlnth, max_sea_bcm, max_sea_dwt,
bale_cargo_facility, cntnr_facility, case_cargo_facility, liquid_cargo_facility, roro_facility, paxfclty, drydkfclty,
lpg_facility, lng_facility, lng_bnkr, do_bnkr, fo_bnkr, isps_compliance_yn, csi_compliance_yn, free_trd_zone, ecfrd_port, emsn_ctrl_area, ws_port,
last_mdfcn_dt, reg_ymd,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,

View file

@ -44,9 +44,13 @@ public class AnchorageCallsRangeJobConfig extends BaseMultiStepJobConfig<Anchora
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "ANCHORAGE_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public AnchorageCallsRangeJobConfig(

View file

@ -43,9 +43,13 @@ public class BerthCallsRangJobConfig extends BaseMultiStepJobConfig<BerthCallsDt
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "BERTH_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public BerthCallsRangJobConfig(
JobRepository jobRepository,

View file

@@ -43,9 +43,13 @@ public class CurrentlyAtRangeJobConfig extends BaseMultiStepJobConfig<CurrentlyA
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "CURRENTLY_AT_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public CurrentlyAtRangeJobConfig(
JobRepository jobRepository,

View file

@@ -1,106 +0,0 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.jobs.movement.batch.processor.DarkActivityProcessor;
import com.snp.batch.jobs.movement.batch.reader.DarkActivityReader;
import com.snp.batch.jobs.movement.batch.writer.DarkActivityWriter;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.web.reactive.function.client.WebClient;
/**
 * Ship detail import Job Config
 *
 * Characteristics:
 * - Fetches IMO numbers from the ship_data table
 * - Splits the IMO numbers into batches of 100
 * - Calls the Maritime API GetShipsByIHSLRorIMONumbers
 * TODO: switch to calling GetShipsByIHSLRorIMONumbersAll
 * - Saves ship details into the ship_detail table (UPSERT)
 *
 * Data flow:
 * DarkActivityReader (ship_data → Maritime API)
 *   → (DarkActivityDto)
 * DarkActivityProcessor
 *   → (DarkActivityEntity)
 * DarkActivityWriter
 *   → (t_darkactivity table)
 */
@Slf4j
@Configuration
public class DarkActivityJobConfig extends BaseJobConfig<DarkActivityDto, DarkActivityEntity> {
private final DarkActivityProcessor darkActivityProcessor;
private final DarkActivityWriter darkActivityWriter;
private final JdbcTemplate jdbcTemplate;
private final WebClient maritimeApiWebClient;
public DarkActivityJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
DarkActivityProcessor darkActivityProcessor,
DarkActivityWriter darkActivityWriter, JdbcTemplate jdbcTemplate,
@Qualifier("maritimeServiceApiWebClient") WebClient maritimeApiWebClient,
ObjectMapper objectMapper) { // ObjectMapper injection added
super(jobRepository, transactionManager);
this.darkActivityProcessor = darkActivityProcessor;
this.darkActivityWriter = darkActivityWriter;
this.jdbcTemplate = jdbcTemplate;
this.maritimeApiWebClient = maritimeApiWebClient;
}
@Override
protected String getJobName() {
return "DarkActivityImportJob";
}
@Override
protected String getStepName() {
return "DarkActivityImportStep";
}
@Override
protected ItemReader<DarkActivityDto> createReader() { // type changed
// Reader constructor revision: pass in the ObjectMapper.
return new DarkActivityReader(maritimeApiWebClient, jdbcTemplate);
}
@Override
protected ItemProcessor<DarkActivityDto, DarkActivityEntity> createProcessor() {
return darkActivityProcessor;
}
@Override
protected ItemWriter<DarkActivityEntity> createWriter() { // type changed
return darkActivityWriter;
}
@Override
protected int getChunkSize() {
return 5; // the API fetches 100 items per call, so the chunk size was meant to match
}
@Bean(name = "DarkActivityImportJob")
public Job darkActivityImportJob() {
return job();
}
@Bean(name = "DarkActivityImportStep")
public Step darkActivityImportStep() {
return step();
}
}

View file

@@ -1,119 +0,0 @@
package com.snp.batch.jobs.movement.batch.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.config.BaseJobConfig;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import com.snp.batch.jobs.movement.batch.processor.DarkActivityProcessor;
import com.snp.batch.jobs.movement.batch.writer.DarkActivityWriter;
import com.snp.batch.jobs.movement.batch.reader.DarkActivityRangeReader;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.web.reactive.function.client.WebClient;
/**
 * Ship detail import Job Config
 *
 * Characteristics:
 * - Fetches IMO numbers from the ship_data table
 * - Splits the IMO numbers into batches of 100
 * - Calls the Maritime API GetShipsByIHSLRorIMONumbers
 * TODO: switch to calling GetShipsByIHSLRorIMONumbersAll
 * - Saves ship details into the ship_detail table (UPSERT)
 *
 * Data flow:
 * DarkActivityReader (ship_data → Maritime API)
 *   → (DarkActivityDto)
 * DarkActivityProcessor
 *   → (DarkActivityEntity)
 * DarkActivityWriter
 *   → (t_darkactivity table)
 */
@Slf4j
@Configuration
public class DarkActivityRangeJobConfig extends BaseJobConfig<DarkActivityDto, DarkActivityEntity> {
private final DarkActivityProcessor darkActivityProcessor;
private final DarkActivityWriter darkActivityWriter;
private final DarkActivityRangeReader darkActivityRangeReader;
private final JdbcTemplate jdbcTemplate;
private final WebClient maritimeApiWebClient;
public DarkActivityRangeJobConfig(
JobRepository jobRepository,
PlatformTransactionManager transactionManager,
DarkActivityProcessor darkActivityProcessor,
DarkActivityWriter darkActivityWriter, JdbcTemplate jdbcTemplate,
@Qualifier("maritimeServiceApiWebClient") WebClient maritimeApiWebClient,
ObjectMapper objectMapper, DarkActivityRangeReader darkActivityRangeReader) { // ObjectMapper injection added
super(jobRepository, transactionManager);
this.darkActivityProcessor = darkActivityProcessor;
this.darkActivityWriter = darkActivityWriter;
this.jdbcTemplate = jdbcTemplate;
this.maritimeApiWebClient = maritimeApiWebClient;
this.darkActivityRangeReader = darkActivityRangeReader;
}
@Override
protected String getJobName() {
return "DarkActivityRangeImportJob";
}
@Override
protected String getStepName() {
return "DarkActivityRangeImportStep";
}
@Override
protected ItemReader<DarkActivityDto> createReader() { // type changed
// Reader constructor revision: pass in the ObjectMapper.
return darkActivityRangeReader;
}
@Bean
@StepScope
public DarkActivityRangeReader darkActivityReader(
@Value("#{jobParameters['startDate']}") String startDate,
@Value("#{jobParameters['stopDate']}") String stopDate
) {
// if jobParameters are missing, null comes through and the Reader applies defaults
return new DarkActivityRangeReader(maritimeApiWebClient, startDate, stopDate);
}
@Override
protected ItemProcessor<DarkActivityDto, DarkActivityEntity> createProcessor() {
return darkActivityProcessor;
}
@Override
protected ItemWriter<DarkActivityEntity> createWriter() { // type changed
return darkActivityWriter;
}
@Override
protected int getChunkSize() {
return 5000; // the API fetches 100 items per call, so the chunk size was meant to match
}
@Bean(name = "DarkActivityRangeImportJob")
public Job darkActivityRangeImportJob() {
return job();
}
@Bean(name = "DarkActivityRangeImportStep")
public Step darkActivityRangeImportStep() {
return step();
}
}

View file

@@ -43,9 +43,13 @@ public class DestinationsRangeJobConfig extends BaseMultiStepJobConfig<Destinati
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "DESTINATIONS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public DestinationsRangeJobConfig(

View file

@@ -43,9 +43,13 @@ public class ShipPortCallsRangeJobConfig extends BaseMultiStepJobConfig<PortCall
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "PORT_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public ShipPortCallsRangeJobConfig(
JobRepository jobRepository,

View file

@@ -43,9 +43,13 @@ public class StsOperationRangeJobConfig extends BaseMultiStepJobConfig<StsOperat
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "STS_OPERATION_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public StsOperationRangeJobConfig(

View file

@@ -43,9 +43,13 @@ public class TerminalCallsRangeJobConfig extends BaseMultiStepJobConfig<Terminal
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "TERMINAL_CALLS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public TerminalCallsRangeJobConfig(

View file

@@ -42,9 +42,13 @@ public class TransitsRangeJobConfig extends BaseMultiStepJobConfig<TransitsDto,
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "TRANSITS_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public TransitsRangeJobConfig(
JobRepository jobRepository,

View file

@@ -1,29 +0,0 @@
package com.snp.batch.jobs.movement.batch.dto;
import lombok.Data;
@Data
public class DarkActivityDto {
private String movementType;
private String imolRorIHSNumber;
private String movementDate;
private Integer facilityId;
private String facilityName;
private String facilityType;
private Integer subFacilityId;
private String subFacilityName;
private String subFacilityType;
private String countryCode;
private String countryName;
private Double draught;
private Double latitude;
private Double longitude;
private DarkActivityPositionDto position;
private String eventStartDate;
}

View file

@@ -1,17 +0,0 @@
package com.snp.batch.jobs.movement.batch.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.Data;
@Data
public class DarkActivityPositionDto {
private boolean isNull;
private int stSrid;
private double lat;
@JsonProperty("long")
private double lon;
private double z;
private double m;
private boolean hasZ;
private boolean hasM;
}

View file

@@ -1,44 +0,0 @@
package com.snp.batch.jobs.movement.batch.entity;
import com.fasterxml.jackson.databind.JsonNode;
import com.snp.batch.common.batch.entity.BaseEntity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
import java.time.LocalDateTime;
@Data
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
@EqualsAndHashCode(callSuper = true)
public class DarkActivityEntity extends BaseEntity {
private Long id;
private String movementType;
private String imolRorIHSNumber;
private LocalDateTime movementDate;
private Integer facilityId;
private String facilityName;
private String facilityType;
private Integer subFacilityId;
private String subFacilityName;
private String subFacilityType;
private String countryCode;
private String countryName;
private Double draught;
private Double latitude;
private Double longitude;
private JsonNode position;
private LocalDateTime eventStartDate;
}

View file

@@ -1,66 +0,0 @@
package com.snp.batch.jobs.movement.batch.processor;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
/**
 * Ship detail Processor
 * Converts ShipDetailDto → ShipDetailEntity
 */
/**
 * Ship detail Processor (hash-compare incremental extraction)
 * I: ShipDetailComparisonData (DB hash + API Map data)
 * O: ShipDetailUpdate (changed rows)
 */
@Slf4j
@Component
public class DarkActivityProcessor extends BaseProcessor<DarkActivityDto, DarkActivityEntity> {
private final ObjectMapper objectMapper;
public DarkActivityProcessor(ObjectMapper objectMapper) {
this.objectMapper = objectMapper;
}
@Override
protected DarkActivityEntity processItem(DarkActivityDto dto) throws Exception {
log.debug("Ship detail processing started: imoNumber={}, facilityName={}",
dto.getImolRorIHSNumber(), dto.getFacilityName());
JsonNode positionNode = null;
if (dto.getPosition() != null) {
// convert the Position object to a JsonNode
positionNode = objectMapper.valueToTree(dto.getPosition());
}
DarkActivityEntity entity = DarkActivityEntity.builder()
.movementType(dto.getMovementType())
.imolRorIHSNumber(dto.getImolRorIHSNumber())
.movementDate(LocalDateTime.parse(dto.getMovementDate()))
.facilityId(dto.getFacilityId())
.facilityName(dto.getFacilityName())
.facilityType(dto.getFacilityType())
.subFacilityId(dto.getSubFacilityId())
.subFacilityName(dto.getSubFacilityName())
.subFacilityType(dto.getSubFacilityType())
.countryCode(dto.getCountryCode())
.countryName(dto.getCountryName())
.draught(dto.getDraught())
.latitude(dto.getLatitude())
.longitude(dto.getLongitude())
.position(positionNode) // mapped as a JsonNode
.eventStartDate(LocalDateTime.parse(dto.getEventStartDate()))
.build();
return entity;
}
}
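The processor above calls `LocalDateTime.parse` directly on the API's date strings. A small sketch of what that parse accepts is worth keeping in mind: `LocalDateTime.parse` handles only zone-less ISO-8601 values, so a trailing offset such as `Z` in the payload would throw a `DateTimeParseException` (the sample value below is made up):

```java
import java.time.LocalDateTime;

public class MovementDateParseSketch {
    public static void main(String[] args) {
        // ISO-8601 local date-time with no zone/offset: the only shape
        // the single-argument LocalDateTime.parse accepts
        LocalDateTime dt = LocalDateTime.parse("2025-03-01T12:34:56");
        System.out.println(dt.getHour()); // 12
    }
}
```

If the API ever returned offset-suffixed timestamps, parsing via `OffsetDateTime.parse(...).toLocalDateTime()` would be the safer route.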

View file

@@ -1,182 +0,0 @@
package com.snp.batch.jobs.movement.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.reactive.function.client.WebClient;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.List;
/**
 * Ship detail Reader (v2.0 - chunk based)
 *
 * Features:
 * 1. Fetches all IMO numbers from the ship_data table (once, up front)
 * 2. Splits the IMO numbers into batches of 100
 * 3. Calls the API with 100 numbers on each fetchNextBatch() call
 * 4. Spring Batch then processes and writes 100 records at a time
 *
 * Chunk flow:
 * - beforeFetch() → fetch all IMOs (once)
 * - fetchNextBatch() → API call with 100 IMOs (1,718 calls)
 * - read() → returns one item at a time (100 times)
 * - Processor/Writer → handles 100 records
 * - repeat... (1,718 chunks)
 *
 * Difference from the previous approach:
 * - Before: load all 170k records into memory → process → write
 * - Now: load 100 at a time → process → write (1,718 chunks)
 */
@Slf4j
@StepScope
public class DarkActivityRangeReader extends BaseApiReader<DarkActivityDto> {
private List<DarkActivityDto> allData;
// to store DB hash values
private int currentBatchIndex = 0;
private final int batchSize = 5000;
// @Value("#{jobParameters['startDate']}")
private String startDate;
// private String startDate = "2025-01-01";
// @Value("#{jobParameters['stopDate']}")
private String stopDate;
// private String stopDate = "2025-12-31";
/*public DarkActivityRangeReader(WebClient webClient) {
super(webClient);
enableChunkMode(); // enable chunk mode
}*/
public DarkActivityRangeReader(WebClient webClient,
@Value("#{jobParameters['startDate']}") String startDate,
@Value("#{jobParameters['stopDate']}") String stopDate) {
super(webClient);
// if no dates are given, default to the previous full day
if (startDate == null || startDate.isBlank() || stopDate == null || stopDate.isBlank()) {
LocalDate yesterday = LocalDate.now().minusDays(1);
this.startDate = yesterday.atStartOfDay().format(DateTimeFormatter.ISO_DATE_TIME) + "Z";
this.stopDate = yesterday.plusDays(1).atStartOfDay().format(DateTimeFormatter.ISO_DATE_TIME) + "Z";
} else {
this.startDate = startDate;
this.stopDate = stopDate;
}
enableChunkMode(); // enable chunk mode
}
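The constructor's fallback logic above can be isolated as a small sketch. One subtlety: `DateTimeFormatter.ISO_DATE_TIME` omits the seconds field when it is zero, so the resulting strings look like `2026-02-13T00:00Z` rather than `...T00:00:00Z` (sketch only, under the same formatting assumptions as the reader):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DefaultDateRangeSketch {
    // mirrors the constructor fallback above: no job parameters → previous day
    public static String[] defaultRange(LocalDate today) {
        LocalDate yesterday = today.minusDays(1);
        String start = yesterday.atStartOfDay().format(DateTimeFormatter.ISO_DATE_TIME) + "Z";
        String stop = yesterday.plusDays(1).atStartOfDay().format(DateTimeFormatter.ISO_DATE_TIME) + "Z";
        return new String[] { start, stop };
    }

    public static void main(String[] args) {
        String[] range = defaultRange(LocalDate.of(2026, 2, 14));
        System.out.println(range[0] + " ~ " + range[1]);
    }
}
```

Appending a literal `"Z"` to a local timestamp assumes the API treats these values as UTC; if not, a zoned formatter would be more precise.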
@Override
protected String getReaderName() {
return "DarkActivityReader";
}
@Override
protected void resetCustomState() {
this.currentBatchIndex = 0;
this.allData = null;
}
@Override
protected String getApiPath() {
return "/Movements/DarkActivity";
}
@Override
protected String getApiBaseUrl() {
return "https://webservices.maritime.spglobal.com";
}
private static final String GET_ALL_IMO_QUERY =
"SELECT imo_number FROM ship_data ORDER BY id";
// "SELECT imo_number FROM snp_data.ship_data where imo_number > (select max(imo) from snp_data.t_darkactivity) ORDER BY imo_number";
/**
 * Runs only once, up front: fetches all IMO numbers from the ship_data table
 */
@Override
protected void beforeFetch() {
log.info("[{}] requested date range: {} → {}", getReaderName(), startDate, stopDate);
}
/**
 * Core chunk-based method: fetches and returns the next batch of 100
 *
 * After Spring Batch completes 100 read() calls, this method is invoked again
 *
 * @return the next batch of 100 items (null when nothing is left)
 */
@Override
protected List<DarkActivityDto> fetchNextBatch() throws Exception {
// 1) on the first call, fetch all data from the API
if (allData == null) {
log.info("[{}] initial API fetch: {} ~ {}", getReaderName(), startDate, stopDate);
allData = callApiWithBatch(startDate, stopDate);
if (allData == null || allData.isEmpty()) {
log.warn("[{}] no data returned → stopping", getReaderName());
return null;
}
log.info("[{}] {} records fetched in total. batchSize = {}", getReaderName(), allData.size(), batchSize);
}
// 2) stop once everything has been read
if (currentBatchIndex >= allData.size()) {
log.info("[{}] all batches processed", getReaderName());
return null;
}
// 3) compute this batch's end index
int endIndex = Math.min(currentBatchIndex + batchSize, allData.size());
// slice out the current batch
List<DarkActivityDto> batch = allData.subList(currentBatchIndex, endIndex);
int currentBatchNumber = (currentBatchIndex / batchSize) + 1;
int totalBatches = (int) Math.ceil((double) allData.size() / batchSize);
log.info("[{}] processing batch {}/{}: {} records", getReaderName(), currentBatchNumber, totalBatches, batch.size());
currentBatchIndex = endIndex;
updateApiCallStats(totalBatches, currentBatchNumber);
return batch;
}
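The index arithmetic in fetchNextBatch() — subList slicing, batch numbering, and Math.ceil for the batch total — can be sketched in isolation (generic helper, not the project's code):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSliceSketch {
    // slice a preloaded list into fixed-size batches, same math as fetchNextBatch()
    public static <T> List<List<T>> slice(List<T> all, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        int index = 0;
        while (index < all.size()) {
            int end = Math.min(index + batchSize, all.size()); // last batch may be short
            batches.add(all.subList(index, end));
            index = end;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 12; i++) data.add(i);
        List<List<Integer>> batches = slice(data, 5);
        // 12 items, batchSize 5 → batches of 5, 5, 2
        System.out.println(batches.size() + " batches, last has " + batches.get(batches.size() - 1).size());
    }
}
```

Note that `subList` returns views backed by the source list, which is fine here because the reader never mutates `allData` while iterating.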
/**
 * API call using query parameters
 *
 * @param startDate start of the requested range
 * @param stopDate end of the requested range
 * @return the API response
 */
private List<DarkActivityDto> callApiWithBatch(String startDate, String stopDate){
String url = getApiPath() + "?startDate=" + startDate +"&stopDate="+stopDate;
// +"&lrno=" + lrno;
log.debug("[{}] API 호출: {}", getReaderName(), url);
return webClient.get()
.uri(url)
.retrieve()
.bodyToFlux(DarkActivityDto.class)
.collectList()
.block();
}
@Override
protected void afterFetch(List<DarkActivityDto> data) {
if (data == null) {
int totalBatches = (int) Math.ceil((double) allData.size() / batchSize);
log.info("[{}] 전체 {} 개 배치 처리 완료", getReaderName(), totalBatches);
/* log.info("[{}] 총 {} 개의 IMO 번호에 대한 API 호출 종료",
getReaderName(), allData.size());*/
}
}
}

View file

@@ -1,212 +0,0 @@
package com.snp.batch.jobs.movement.batch.reader;
import com.snp.batch.common.batch.reader.BaseApiReader;
import com.snp.batch.jobs.movement.batch.dto.DarkActivityDto;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.reactive.function.client.WebClient;
import java.util.Collections;
import java.util.List;
import java.util.Map;
/**
 * Ship detail Reader (v2.0 - chunk based)
 *
 * Features:
 * 1. Fetches all IMO numbers from the ship_data table (once, up front)
 * 2. Splits the IMO numbers into batches of 100
 * 3. Calls the API with 100 numbers on each fetchNextBatch() call
 * 4. Spring Batch then processes and writes 100 records at a time
 *
 * Chunk flow:
 * - beforeFetch() → fetch all IMOs (once)
 * - fetchNextBatch() → API call with 100 IMOs (1,718 calls)
 * - read() → returns one item at a time (100 times)
 * - Processor/Writer → handles 100 records
 * - repeat... (1,718 chunks)
 *
 * Difference from the previous approach:
 * - Before: load all 170k records into memory → process → write
 * - Now: load 100 at a time → process → write (1,718 chunks)
 */
@Slf4j
@StepScope
public class DarkActivityReader extends BaseApiReader<DarkActivityDto> {
private final JdbcTemplate jdbcTemplate;
// batch processing state
private List<String> allImoNumbers;
// to store DB hash values
private Map<String, String> dbMasterHashes;
private int currentBatchIndex = 0;
private final int batchSize = 5;
// @Value("#{jobParameters['startDate']}")
// private String startDate;
private String startDate = "2025-01-01";
// @Value("#{jobParameters['stopDate']}")
// private String stopDate;
private String stopDate = "2025-12-31";
public DarkActivityReader(WebClient webClient, JdbcTemplate jdbcTemplate ) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
enableChunkMode(); // Chunk 모드 활성화
}
@Override
protected String getReaderName() {
return "DarkActivityReader";
}
@Override
protected void resetCustomState() {
this.currentBatchIndex = 0;
this.allImoNumbers = null;
this.dbMasterHashes = null;
}
@Override
protected String getApiPath() {
return "/Movements/DarkActivity";
}
@Override
protected String getApiBaseUrl() {
return "https://webservices.maritime.spglobal.com";
}
private static final String GET_ALL_IMO_QUERY =
"SELECT imo_number FROM ship_data ORDER BY id";
// "SELECT imo_number FROM snp_data.ship_data where imo_number > (select max(imo) from snp_data.t_darkactivity) ORDER BY imo_number";
/**
 * Runs only once, up front: fetches all IMO numbers from the ship_data table
 */
@Override
protected void beforeFetch() {
// pre-processing
// Step 1. fetch every IMO number
log.info("[{}] starting IMO number lookup from the ship_data table...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_ALL_IMO_QUERY, String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] fetched {} IMO numbers in total", getReaderName(), allImoNumbers.size());
log.info("[{}] will call the API in batches of {}", getReaderName(), batchSize);
log.info("[{}] expected number of batches: {}", getReaderName(), totalBatches);
// initialize API statistics
updateApiCallStats(totalBatches, 0);
}
/**
 * Core chunk-based method: fetches and returns the next batch of 100
 *
 * After Spring Batch completes 100 read() calls, this method is invoked again
 *
 * @return the next batch of 100 items (null when nothing is left)
 */
@Override
protected List<DarkActivityDto> fetchNextBatch() throws Exception {
// check whether all batches are done
if (allImoNumbers == null || currentBatchIndex >= allImoNumbers.size()) {
return null; // end the Job
}
// compute the current batch's start/end indexes
int startIndex = currentBatchIndex;
int endIndex = Math.min(currentBatchIndex + batchSize, allImoNumbers.size());
// extract the current batch of IMO numbers (100)
List<String> currentBatch = allImoNumbers.subList(startIndex, endIndex);
int currentBatchNumber = (currentBatchIndex / batchSize) + 1;
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] processing batch {}/{} ({} IMO numbers)...",
getReaderName(), currentBatchNumber, totalBatches, currentBatch.size());
try {
// join the IMO numbers with commas (e.g. "1000019,1000021,1000033,...")
String imoParam = String.join(",", currentBatch);
// call the API
List<DarkActivityDto> response = callApiWithBatch(imoParam);
// advance the index to the next batch
currentBatchIndex = endIndex;
// handle the response
if (response != null) {
List<DarkActivityDto> darkActivityList = response;
log.info("[{}] batch {}/{} done: {} records fetched",
getReaderName(), currentBatchNumber, totalBatches, darkActivityList.size());
// update API call statistics
updateApiCallStats(totalBatches, currentBatchNumber);
// avoid overloading the API (wait 0.5s before the next batch)
if (currentBatchIndex < allImoNumbers.size()) {
Thread.sleep(500);
}
return darkActivityList;
} else {
log.warn("[{}] batch {}/{} returned no response",
getReaderName(), currentBatchNumber, totalBatches);
// update API call statistics (failures count too)
updateApiCallStats(totalBatches, currentBatchNumber);
return Collections.emptyList();
}
} catch (Exception e) {
log.error("[{}] error while processing batch {}/{}: {}",
getReaderName(), currentBatchNumber, totalBatches, e.getMessage(), e);
// move on to the next batch even on error (partial failures tolerated)
currentBatchIndex = endIndex;
// return an empty list (the Job keeps going)
return Collections.emptyList();
}
}
}
/**
 * API call using query parameters
 *
 * @param lrno comma-joined IMO numbers (e.g. "1000019,1000021,...")
 * @return the API response
 */
private List<DarkActivityDto> callApiWithBatch(String lrno) {
String url = getApiPath() + "?startDate=" + startDate +"&stopDate="+stopDate+"&lrno=" + lrno;
log.debug("[{}] API 호출: {}", getReaderName(), url);
return webClient.get()
.uri(url)
.retrieve()
.bodyToFlux(DarkActivityDto.class)
.collectList()
.block();
}
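The way callApiWithBatch() assembles its URL — comma-joined IMO numbers appended as the `lrno` query parameter — can be sketched on its own (the IMO values below are made up, and real code would URL-encode the parameters):

```java
import java.util.List;

public class ImoParamSketch {
    // join the batch of IMO numbers with commas and build the query string
    public static String buildUrl(String path, String start, String stop, List<String> imos) {
        return path + "?startDate=" + start + "&stopDate=" + stop + "&lrno=" + String.join(",", imos);
    }

    public static void main(String[] args) {
        String url = buildUrl("/Movements/DarkActivity", "2025-01-01", "2025-12-31",
                List.of("1000019", "1000021", "1000033"));
        System.out.println(url);
        // /Movements/DarkActivity?startDate=2025-01-01&stopDate=2025-12-31&lrno=1000019,1000021,1000033
    }
}
```

With batches of 100, keeping an eye on the resulting URL length is worthwhile; servers commonly cap request lines at a few thousand characters.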
@Override
protected void afterFetch(List<DarkActivityDto> data) {
if (data == null) {
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] all {} batches processed", getReaderName(), totalBatches);
log.info("[{}] finished API calls for a total of {} IMO numbers",
getReaderName(), allImoNumbers.size());
}
}
}

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.AnchorageCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class AnchorageCallsRepositoryImpl extends BaseJdbcRepository<AnchorageCallsEntity, String>
implements AnchorageCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-001}")
private String tableName;
public AnchorageCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_anchoragecall";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,25 +54,25 @@ public class AnchorageCallsRepositoryImpl extends BaseJdbcRepository<AnchorageCa
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
stpov_id,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
ntn_cd,
ntn_nm,
prtcll_id,
facility_id,
facility_nm,
facility_type,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
dstn,
iso2_ntn_cd,
lcinfo,
job_execution_id, created_by
dest,
iso_two_country_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}
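The repository hunks above replace a hardcoded getTableName() (e.g. "t_snp_data.t_anchoragecall") with a schema/table pair resolved from configuration via getTargetSchema() and getSimpleTableName(). A sketch of the presumed composition in the base class — assumed, since BaseJdbcRepository itself is not shown in this diff:

```java
// Sketch (assumed behavior of BaseJdbcRepository after this change):
// the qualified table name is composed from configured schema and table names.
public class QualifiedTableNameSketch {
    private final String targetSchema; // @Value("${app.batch.target-schema.name}")
    private final String tableName;    // @Value("${app.batch.target-schema.tables.movements-001}")

    public QualifiedTableNameSketch(String targetSchema, String tableName) {
        this.targetSchema = targetSchema;
        this.tableName = tableName;
    }

    // replaces the old hardcoded "t_snp_data.t_anchoragecall"
    public String getTableName() {
        return targetSchema + "." + tableName;
    }

    public static void main(String[] args) {
        System.out.println(new QualifiedTableNameSketch("t_snp_data", "t_anchoragecall").getTableName());
    }
}
```

This keeps the INSERT templates (`"INSERT INTO %s(...)".formatted(getTableName())`) unchanged while making the schema environment-specific.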

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.BerthCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class BerthCallsRepositoryImpl extends BaseJdbcRepository<BerthCallsEntity, String>
implements BerthCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-002}")
private String tableName;
public BerthCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_berthcall";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,25 +54,25 @@ public class BerthCallsRepositoryImpl extends BaseJdbcRepository<BerthCallsEntit
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
facility_id,
facility_nm,
facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
prnt_call_id,
iso2_ntn_cd,
evt_start_dt,
lcinfo,
job_execution_id, created_by
up_clot_id,
iso_two_country_cd,
event_sta_dt,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.CurrentlyAtEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class CurrentlyAtRepositoryImpl extends BaseJdbcRepository<CurrentlyAtEntity, String>
implements CurrentlyAtRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-003}")
private String tableName;
public CurrentlyAtRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_currentlyat";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,28 +54,28 @@ public class CurrentlyAtRepositoryImpl extends BaseJdbcRepository<CurrentlyAtEnt
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
stpov_id,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
prtcll_id,
facility_id,
facility_nm,
facility_type,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
dstn,
iso2_ntn_cd,
lcinfo,
job_execution_id, created_by
dest,
country_iso_two_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@@ -1,13 +0,0 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import java.util.List;
/**
 * Ship detail Repository interface
 */
public interface DarkActivityRepository {
void saveAll(List<DarkActivityEntity> entities);
}

View file

@@ -1,187 +0,0 @@
package com.snp.batch.jobs.movement.batch.repository;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.List;
/**
 * DarkActivity Repository implementation
 * Extends BaseJdbcRepository to provide JDBC-based CRUD
 */
@Slf4j
@Repository("")
public class DarkActivityRepositoryImpl extends BaseJdbcRepository<DarkActivityEntity, String>
implements DarkActivityRepository {
public DarkActivityRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "new_snp.t_darkactivity";
// return "snp_data.t_darkactivity";
}
@Override
protected String getEntityName() {
return "DarkActivity";
}
@Override
protected String extractId(DarkActivityEntity entity) {
return entity.getImolRorIHSNumber();
}
@Override
public String getInsertSql() {
// return """
// INSERT INTO snp_data.t_darkactivity(
return """
INSERT INTO new_snp.t_darkactivity(
imo,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
ntn_cd,
ntn_nm,
draft,
lat,
lon,
evt_start_dt,
lcinfo
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT (imo, mvmn_type, mvmn_dt, fclty_id)
DO UPDATE SET
mvmn_type = EXCLUDED.mvmn_type,
mvmn_dt = EXCLUDED.mvmn_dt,
fclty_id = EXCLUDED.fclty_id,
fclty_nm = EXCLUDED.fclty_nm,
fclty_type = EXCLUDED.fclty_type,
lwrnk_fclty_id = EXCLUDED.lwrnk_fclty_id,
lwrnk_fclty_nm = EXCLUDED.lwrnk_fclty_nm,
lwrnk_fclty_type = EXCLUDED.lwrnk_fclty_type,
ntn_cd = EXCLUDED.ntn_cd,
ntn_nm = EXCLUDED.ntn_nm,
draft = EXCLUDED.draft,
lat = EXCLUDED.lat,
lon = EXCLUDED.lon,
evt_start_dt = EXCLUDED.evt_start_dt,
lcinfo = EXCLUDED.lcinfo
""";
}
@Override
protected String getUpdateSql() {
return null;
}
@Override
protected void setInsertParameters(PreparedStatement ps, DarkActivityEntity e) throws Exception {
int i = 1;
ps.setString(i++, e.getImolRorIHSNumber()); // imo
ps.setString(i++, e.getMovementType()); // mvmn_type
ps.setTimestamp(i++, e.getMovementDate() != null ? Timestamp.valueOf(e.getMovementDate()) : null); // mvmn_dt
ps.setObject(i++, e.getFacilityId()); // fclty_id
ps.setString(i++, e.getFacilityName()); // fclty_nm
ps.setString(i++, e.getFacilityType()); // fclty_type
ps.setObject(i++, e.getSubFacilityId()); //lwrnk_fclty_id
ps.setString(i++, e.getSubFacilityName()); // lwrnk_fclty_nm
ps.setString(i++, e.getSubFacilityType()); //lwrnk_fclty_type
ps.setString(i++, e.getCountryCode()); // ntn_cd
ps.setString(i++, e.getCountryName()); // ntn_nm
setDoubleOrNull(ps, i++, e.getDraught()); // draft
setDoubleOrNull(ps, i++, e.getLatitude()); // lat
setDoubleOrNull(ps, i++, e.getLongitude());// lon
ps.setTimestamp(i++, e.getEventStartDate() != null ? Timestamp.valueOf(e.getEventStartDate()) : null); // evt_start_dt
if (e.getPosition() != null) {
ps.setObject(i++, OBJECT_MAPPER.writeValueAsString(e.getPosition()), java.sql.Types.OTHER); // lcinfo (jsonb)
} else {
ps.setNull(i++, java.sql.Types.OTHER);
}
}
private void setDoubleOrNull(PreparedStatement ps, int index, Double value) throws Exception {
if (value != null) {
ps.setDouble(index, value);
} else {
// Explicitly set SQL NULL using java.sql.Types.DOUBLE
ps.setNull(index, java.sql.Types.DOUBLE);
}
}
@Override
protected void setUpdateParameters(PreparedStatement ps, DarkActivityEntity entity) throws Exception {
}
@Override
protected RowMapper<DarkActivityEntity> getRowMapper() {
return null;
}
@Override
public void saveAll(List<DarkActivityEntity> entities) {
if (entities == null || entities.isEmpty()) return;
log.info("DarkActivity save started = {} rows", entities.size());
batchInsert(entities);
}
/**
 * DarkActivityEntity RowMapper
 */
private static class DarkActivityRowMapper implements RowMapper<DarkActivityEntity> {
@Override
public DarkActivityEntity mapRow(ResultSet rs, int rowNum) throws SQLException {
DarkActivityEntity entity = DarkActivityEntity.builder()
.id(rs.getLong("id"))
.imolRorIHSNumber(rs.getString("imolRorIHSNumber"))
.facilityId(rs.getObject("facilityId", Integer.class))
.facilityName(rs.getString("facilityName"))
.facilityType(rs.getString("facilityType"))
.countryCode(rs.getString("countryCode"))
.countryName(rs.getString("countryName"))
.draught(rs.getObject("draught", Double.class))
.latitude(rs.getObject("latitude", Double.class))
.longitude(rs.getObject("longitude", Double.class))
.position(parseJson(rs.getString("position")))
.build();
Timestamp movementDate = rs.getTimestamp("movementDate");
if (movementDate != null) {
entity.setMovementDate(movementDate.toLocalDateTime());
}
return entity;
}
private JsonNode parseJson(String json) {
try {
if (json == null) return null;
return new ObjectMapper().readTree(json);
} catch (Exception e) {
throw new RuntimeException("JSON parsing error: " + json);
}
}
}
}

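The deleted DarkActivity repository relied on PostgreSQL's `ON CONFLICT (imo, mvmn_type, mvmn_dt, fclty_id) DO UPDATE SET col = EXCLUDED.col` upsert. As a rough analogy only (not the actual persistence code), the semantics are those of a map put keyed on the conflict columns: an absent key inserts the row, a present key overwrites it with the incoming (`EXCLUDED`) values. The key string and draft values below are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class UpsertSemanticsSketch {
    // Applies one row keyed by the ON CONFLICT columns: absent key -> insert,
    // present key -> the stored row is replaced by the incoming (EXCLUDED) value.
    public static Map<String, Double> apply(Map<String, Double> table, String key, double draft) {
        table.put(key, draft);
        return table;
    }

    public static void main(String[] args) {
        Map<String, Double> t = new LinkedHashMap<>();
        apply(t, "9999999|DARK|2026-02-01|42", 7.5); // first call inserts
        apply(t, "9999999|DARK|2026-02-01|42", 8.1); // conflict: row is updated
        // prints "1 8.1" - one row, holding the latest draft
        System.out.println(t.size() + " " + t.get("9999999|DARK|2026-02-01|42"));
    }
}
```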
View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.DestinationEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class DestinationRepositoryImpl extends BaseJdbcRepository<DestinationEntity, String>
implements DestinationRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-004}")
private String tableName;
public DestinationRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_destination";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -39,23 +52,21 @@ public class DestinationRepositoryImpl extends BaseJdbcRepository<DestinationEnt
@Override
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
ntn_cd,
ntn_nm,
facility_id,
facility_nm,
facility_type,
country_cd,
country_nm,
lat,
lon,
iso2_ntn_cd,
lcinfo,
job_execution_id, created_by
country_iso_two_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.PortCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity, String>
implements PortCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-005}")
private String tableName;
public PortCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_ship_stpov_info";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,28 +54,28 @@ public class PortCallsRepositoryImpl extends BaseJdbcRepository<PortCallsEntity,
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
stpov_id,
fclty_id,
fclty_nm,
fclty_type,
lwrnk_fclty_id,
lwrnk_fclty_nm,
lwrnk_fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
prtcll_id,
facility_id,
facility_nm,
facility_type,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
dstn,
iso2_ntn_cd,
lcinfo,
job_execution_id, created_by
dest,
country_iso_two_cd,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.StsOperationEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class StsOperationRepositoryImpl extends BaseJdbcRepository<StsOperationEntity, String>
implements StsOperationRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-006}")
private String tableName;
public StsOperationRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_stsoperation";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,26 +54,26 @@ public class StsOperationRepositoryImpl extends BaseJdbcRepository<StsOperationE
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
facility_id,
facility_nm,
facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
draft,
lat,
lon,
prnt_call_id,
ntn_cd,
ntn_nm,
sts_location,
up_prtcll_id,
country_cd,
country_nm,
sts_position,
sts_type,
evt_start_dt,
lcinfo,
job_execution_id, created_by
event_sta_dt,
position_info,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.TerminalCallsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class TerminalCallsRepositoryImpl extends BaseJdbcRepository<TerminalCallsEntity, String>
implements TerminalCallsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-007}")
private String tableName;
public TerminalCallsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_terminalcall";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -39,32 +52,30 @@ public class TerminalCallsRepositoryImpl extends BaseJdbcRepository<TerminalCall
@Override
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
fclty_id,
fclty_nm,
fclty_type,
up_fclty_id,
up_fclty_nm,
up_fclty_type,
ntn_cd,
ntn_nm,
facility_id,
facility_nm,
facility_type,
up_facility_id,
up_facility_nm,
up_facility_type,
country_cd,
country_nm,
draft,
lat,
lon,
prnt_call_id,
iso2_ntn_cd,
evt_start_dt,
lcinfo,
sub_fclty_id,
sub_fclty_nm,
sub_fclty_type,
job_execution_id, created_by
up_prtcll_id,
country_iso_two_cd,
event_sta_dt,
position_info,
lwrnk_facility_id,
lwrnk_facility_desc,
lwrnk_facility_type,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

View file

@@ -4,6 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.movement.batch.entity.TransitsEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,25 @@ import java.util.List;
public class TransitsRepositoryImpl extends BaseJdbcRepository<TransitsEntity, String>
implements TransitsRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.movements-008}")
private String tableName;
public TransitsRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();
@Override
protected String getTableName() {
return "t_snp_data.t_transit";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -41,13 +54,13 @@ public class TransitsRepositoryImpl extends BaseJdbcRepository<TransitsEntity, S
public String getInsertSql() {
return """
INSERT INTO %s(
imo,
imo_no,
mvmn_type,
mvmn_dt,
fclty_nm,
fclty_type,
facility_nm,
facility_type,
draft,
job_execution_id, created_by
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?, ?, ?, ?);
""".formatted(getTableName());
}

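With this many renamed columns, the one invariant every `getInsertSql()` above must keep is that the number of `?` placeholders equals the number of listed columns. A small standalone check, using the Transits statement from this diff with a sample table name substituted (the check itself is not part of the project):

```java
public class PlaceholderCountCheck {
    // Counts occurrences of a character in a string.
    public static long count(String s, char c) {
        return s.chars().filter(ch -> ch == c).count();
    }

    public static void main(String[] args) {
        String sql = """
            INSERT INTO %s(
            imo_no,
            mvmn_type,
            mvmn_dt,
            facility_nm,
            facility_type,
            draft,
            job_execution_id, creatr_id
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?);
            """.formatted("t_snp_data.t_transit");
        long placeholders = count(sql, '?');
        // commas in the column list + 1 = number of columns
        long columns = count(sql.substring(0, sql.indexOf(") VALUES")), ',') + 1;
        // prints "8 8" - placeholders and columns agree
        System.out.println(placeholders + " " + columns);
    }
}
```

The same comma-versus-placeholder tally is worth running mentally on the wider statements (CurrentlyAt and TerminalCalls both list 23 columns against 23 placeholders).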
View file

@@ -1,35 +0,0 @@
package com.snp.batch.jobs.movement.batch.writer;
import com.snp.batch.common.batch.writer.BaseWriter;
import com.snp.batch.jobs.movement.batch.repository.DarkActivityRepository;
import com.snp.batch.jobs.movement.batch.entity.DarkActivityEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Component;
import java.util.List;
/**
 * DarkActivity Writer
 */
@Slf4j
@Component
public class DarkActivityWriter extends BaseWriter<DarkActivityEntity> {
private final DarkActivityRepository darkActivityRepository;
public DarkActivityWriter(DarkActivityRepository darkActivityRepository) {
super("DarkActivity");
this.darkActivityRepository = darkActivityRepository;
}
@Override
protected void writeItems(List<DarkActivityEntity> items) throws Exception {
if (items.isEmpty()) { return; }
darkActivityRepository.saveAll(items);
log.info("DarkActivity data saved: {} rows", items.size());
}
}

View file

@@ -43,9 +43,13 @@ public class PscInspectionJobConfig extends BaseMultiStepJobConfig<PscInspection
@Value("${app.batch.ship-api.url}")
private String maritimeApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "PSC_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public PscInspectionJobConfig(
JobRepository jobRepository,

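The change above swaps the hardcoded `T_SNP_DATA` schema in the batch-timestamp update for the injected `targetSchema`. A standalone sketch of that composition (the schema value and API key mirror the diff; the helper method name is illustrative):

```java
public class BatchUpdateSqlSketch {
    // Builds the last-success-timestamp update with the schema injected,
    // matching the String.format call in PscInspectionJobConfig.
    public static String batchUpdateSql(String targetSchema, String apiKey) {
        return String.format(
            "UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), "
            + "UPDATED_AT = NOW() WHERE API_KEY = '%s'",
            targetSchema, apiKey);
    }

    public static void main(String[] args) {
        System.out.println(batchUpdateSql("t_snp_data", "PSC_IMPORT_API"));
    }
}
```

Note the API key is still spliced in with `String.format` rather than bound as a JDBC parameter; that is safe here only because `getApiKey()` returns a constant.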
View file

@@ -1,67 +0,0 @@
package com.snp.batch.jobs.pscInspection.batch.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.Data;
@Data
public class PscCertificateDto {
@JsonProperty("Type_Id")
private String typeId;
@JsonProperty("DataSetVersion")
private PscDataSetVersionDto dataSetVersion;
@JsonProperty("Certificate_ID")
private String certificateId;
@JsonProperty("Certificate_Title")
private String certificateTitle;
@JsonProperty("Certificate_Title_Code")
private String certificateTitleCode;
@JsonProperty("Class_SOC_Of_Issuer")
private String classSocOfIssuer;
@JsonProperty("Expiry_Date")
private String expiryDate; // kept as the raw ISO date string
@JsonProperty("Inspection_ID")
private String inspectionId;
@JsonProperty("Issue_Date")
private String issueDate;
@JsonProperty("Issuing_Authority")
private String issuingAuthority;
@JsonProperty("Issuing_Authority_Code")
private String issuingAuthorityCode;
@JsonProperty("Last_Survey_Date")
private String lastSurveyDate;
@JsonProperty("Latest_Survey_Place")
private String latestSurveyPlace;
@JsonProperty("Latest_Survey_Place_Code")
private String latestSurveyPlaceCode;
@JsonProperty("Lrno")
private String lrno;
@JsonProperty("Other_Issuing_Authority")
private String otherIssuingAuthority;
@JsonProperty("Other_Survey_Authority")
private String otherSurveyAuthority;
@JsonProperty("Survey_Authority")
private String surveyAuthority;
@JsonProperty("Survey_Authority_Code")
private String surveyAuthorityCode;
@JsonProperty("Survey_Authority_Type")
private String surveyAuthorityType;
}

View file

@@ -114,9 +114,6 @@ public class PscInspectionDto {
@JsonProperty("PSCDefects")
private List<PscDefectDto> pscDefects;
@JsonProperty("PSCCertificates")
private List<PscCertificateDto> pscCertificates;
@JsonProperty("PSCAllCertificates")
private List<PscAllCertificateDto> pscAllCertificates;
}

View file

@@ -1,45 +0,0 @@
package com.snp.batch.jobs.pscInspection.batch.entity;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.experimental.SuperBuilder;
import java.time.LocalDateTime;
@Data
@SuperBuilder
@NoArgsConstructor
@AllArgsConstructor
public class PscCertificateEntity {
private String certificateId;
private String typeId;
private String dataSetVersion;
private String certificateTitle;
private String certificateTitleCode;
private String classSocOfIssuer;
private LocalDateTime expiryDate;
private String inspectionId;
private LocalDateTime issueDate;
private String issuingAuthority;
private String issuingAuthorityCode;
private LocalDateTime lastSurveyDate;
private String latestSurveyPlace;
private String latestSurveyPlaceCode;
private String lrno;
private String otherIssuingAuthority;
private String otherSurveyAuthority;
private String surveyAuthority;
private String surveyAuthorityCode;
private String surveyAuthorityType;
}

View file

@@ -59,6 +59,5 @@ public class PscInspectionEntity extends BaseEntity {
private String yearOfBuild;
private List<PscDefectEntity> defects;
private List<PscCertificateEntity> certificates;
private List<PscAllCertificateEntity> allCertificates;
}

View file

@@ -3,11 +3,9 @@ package com.snp.batch.jobs.pscInspection.batch.processor;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.snp.batch.common.batch.processor.BaseProcessor;
import com.snp.batch.jobs.pscInspection.batch.dto.PscAllCertificateDto;
import com.snp.batch.jobs.pscInspection.batch.dto.PscCertificateDto;
import com.snp.batch.jobs.pscInspection.batch.dto.PscDefectDto;
import com.snp.batch.jobs.pscInspection.batch.dto.PscInspectionDto;
import com.snp.batch.jobs.pscInspection.batch.entity.PscAllCertificateEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscCertificateEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscDefectEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscInspectionEntity;
import lombok.extern.slf4j.Slf4j;
@@ -85,7 +83,6 @@ public class PscInspectionProcessor extends BaseProcessor<PscInspectionDto, PscI
// null-safe list handling
entity.setDefects(item.getPscDefects() == null ? List.of() : convertDefectDtos(item.getPscDefects()));
entity.setCertificates(item.getPscCertificates() == null ? List.of() : convertCertificateDtos(item.getPscCertificates()));
entity.setAllCertificates(item.getPscAllCertificates() == null ? List.of() : convertAllCertificateDtos(item.getPscAllCertificates()));
@@ -198,34 +195,6 @@
.build())
.collect(Collectors.toList());
}
private List<PscCertificateEntity> convertCertificateDtos(List<PscCertificateDto> dtos) {
if (dtos == null || dtos.isEmpty()) return List.of();
return dtos.stream()
.map(dto -> PscCertificateEntity.builder()
.certificateId(dto.getCertificateId())
.typeId(dto.getTypeId())
.dataSetVersion(dto.getDataSetVersion() != null ? dto.getDataSetVersion().getDataSetVersion() : null)
.certificateTitle(dto.getCertificateTitle())
.certificateTitleCode(dto.getCertificateTitleCode())
.classSocOfIssuer(dto.getClassSocOfIssuer())
.issueDate(dto.getIssueDate() != null ? parseFlexible(dto.getIssueDate()) : null)
.expiryDate(dto.getExpiryDate() != null ? parseFlexible(dto.getExpiryDate()) : null)
.inspectionId(dto.getInspectionId())
.issuingAuthority(dto.getIssuingAuthority())
.issuingAuthorityCode(dto.getIssuingAuthorityCode())
.lastSurveyDate(dto.getLastSurveyDate() != null ? parseFlexible(dto.getLastSurveyDate()) : null)
.latestSurveyPlace(dto.getLatestSurveyPlace())
.latestSurveyPlaceCode(dto.getLatestSurveyPlaceCode())
.lrno(dto.getLrno())
.otherIssuingAuthority(dto.getOtherIssuingAuthority())
.otherSurveyAuthority(dto.getOtherSurveyAuthority())
.surveyAuthority(dto.getSurveyAuthority())
.surveyAuthorityCode(dto.getSurveyAuthorityCode())
.surveyAuthorityType(dto.getSurveyAuthorityType())
.build())
.collect(Collectors.toList());
}
public static List<PscAllCertificateEntity> convertAllCertificateDtos(List<PscAllCertificateDto> dtos) {
if (dtos == null || dtos.isEmpty()) return List.of();

View file

@@ -2,8 +2,8 @@ package com.snp.batch.jobs.pscInspection.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.pscInspection.batch.entity.PscAllCertificateEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscCertificateEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -17,13 +17,25 @@ import java.util.List;
@Repository
public class PscAllCertificateRepositoryImpl extends BaseJdbcRepository<PscAllCertificateEntity, String>
implements PscAllCertificateRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.psc-003}")
private String tableName;
public PscAllCertificateRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "new_snp.psc_all_certificate";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -44,28 +56,28 @@ public class PscAllCertificateRepositoryImpl extends BaseJdbcRepository<PscAllCe
@Override
public String getInsertSql() {
return """
INSERT INTO t_snp_data.psc_all_certificate(
certificate_id,
data_set_version,
inspection_id,
lrno,
certificate_title_code,
certificate_title,
issuing_authority_code,
issuing_authority,
other_issuing_authority,
issue_date,
expiry_date,
last_survey_date,
survey_authority_code,
survey_authority,
other_survey_authority,
latest_survey_place,
latest_survey_place_code,
survey_authority_type,
inspection_date,
inspected_by,
job_execution_id, created_by
INSERT INTO %s(""".formatted(getTableName()) + """
cert_id,
dataset_ver,
inspection_id,
imo_no,
certf_nm_cd,
certf_nm,
issue_engines_cd,
issue_engines,
etc_issue_engines,
issue_ymd,
expry_ymd,
last_inspection_ymd,
inspection_engines_cd,
inspection_engines,
etc_inspection_engines,
recent_inspection_plc,
recent_inspection_plc_cd,
inspection_engines_type,
check_ymd,
insptr,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,

View file

@@ -1,10 +0,0 @@
package com.snp.batch.jobs.pscInspection.batch.repository;
import com.snp.batch.jobs.pscInspection.batch.entity.PscCertificateEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscDefectEntity;
import java.util.List;
public interface PscCertificateRepository {
void saveCertificates(List<PscCertificateEntity> certificates);
}

View file

@@ -1,139 +0,0 @@
package com.snp.batch.jobs.pscInspection.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.pscInspection.batch.entity.PscCertificateEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscDefectEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.util.List;
@Slf4j
@Repository
public class PscCertificateRepositoryImpl extends BaseJdbcRepository<PscCertificateEntity, String>
implements PscCertificateRepository {
public PscCertificateRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "new_snp.psc_certificate";
}
@Override
protected RowMapper<PscCertificateEntity> getRowMapper() {
return null;
}
@Override
protected String getEntityName() {
return "PscCertificate";
}
@Override
protected String extractId(PscCertificateEntity entity) {
return entity.getCertificateId();
}
@Override
public String getInsertSql() {
return """
INSERT INTO new_snp.psc_certificate(
certificate_id,
type_id,
data_set_version,
certificate_title,
certificate_title_code,
class_soc_of_issuer,
expiry_date,
inspection_id,
issue_date,
issuing_authority,
issuing_authority_code,
last_survey_date,
latest_survey_place,
latest_survey_place_code,
lrno,
other_issuing_authority,
other_survey_authority,
survey_authority,
survey_authority_code,
survey_authority_type
) VALUES (
?,?,?,?,?,?,?,?,?,?,
?,?,?,?,?,?,?,?,?,?
)
ON CONFLICT (certificate_id)
DO UPDATE SET
type_id = EXCLUDED.type_id,
data_set_version = EXCLUDED.data_set_version,
certificate_title = EXCLUDED.certificate_title,
certificate_title_code = EXCLUDED.certificate_title_code,
class_soc_of_issuer = EXCLUDED.class_soc_of_issuer,
expiry_date = EXCLUDED.expiry_date,
inspection_id = EXCLUDED.inspection_id,
issue_date = EXCLUDED.issue_date,
issuing_authority = EXCLUDED.issuing_authority,
issuing_authority_code = EXCLUDED.issuing_authority_code,
last_survey_date = EXCLUDED.last_survey_date,
latest_survey_place = EXCLUDED.latest_survey_place,
latest_survey_place_code = EXCLUDED.latest_survey_place_code,
lrno = EXCLUDED.lrno,
other_issuing_authority = EXCLUDED.other_issuing_authority,
other_survey_authority = EXCLUDED.other_survey_authority,
survey_authority = EXCLUDED.survey_authority,
survey_authority_code = EXCLUDED.survey_authority_code,
survey_authority_type = EXCLUDED.survey_authority_type
""";
}
@Override
protected String getUpdateSql() {
return null;
}
@Override
protected void setInsertParameters(PreparedStatement ps, PscCertificateEntity e) throws Exception {
int i = 1;
ps.setString(i++, e.getCertificateId());
ps.setString(i++, e.getTypeId());
ps.setString(i++, e.getDataSetVersion());
ps.setString(i++, e.getCertificateTitle());
ps.setString(i++, e.getCertificateTitleCode());
ps.setString(i++, e.getClassSocOfIssuer());
ps.setTimestamp(i++, e.getExpiryDate() != null ? Timestamp.valueOf(e.getExpiryDate()) : null);
ps.setString(i++, e.getInspectionId());
ps.setTimestamp(i++, e.getIssueDate() != null ? Timestamp.valueOf(e.getIssueDate()) : null);
ps.setString(i++, e.getIssuingAuthority());
ps.setString(i++, e.getIssuingAuthorityCode());
ps.setTimestamp(i++, e.getLastSurveyDate() != null ? Timestamp.valueOf(e.getLastSurveyDate()) : null);
ps.setString(i++, e.getLatestSurveyPlace());
ps.setString(i++, e.getLatestSurveyPlaceCode());
ps.setString(i++, e.getLrno());
ps.setString(i++, e.getOtherIssuingAuthority());
ps.setString(i++, e.getOtherSurveyAuthority());
ps.setString(i++, e.getSurveyAuthority());
ps.setString(i++, e.getSurveyAuthorityCode());
ps.setString(i++, e.getSurveyAuthorityType());
}
@Override
protected void setUpdateParameters(PreparedStatement ps, PscCertificateEntity entity) throws Exception {
}
@Override
public void saveCertificates(List<PscCertificateEntity> entities) {
if (entities == null || entities.isEmpty()) return;
// log.info("PSC Certificate save started = {} rows", entities.size());
batchInsert(entities);
}
}

View file

@@ -4,6 +4,7 @@ import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.pscInspection.batch.entity.PscDefectEntity;
import com.snp.batch.jobs.pscInspection.batch.entity.PscInspectionEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -17,13 +18,25 @@ import java.util.List;
@Repository
public class PscDefectRepositoryImpl extends BaseJdbcRepository<PscDefectEntity, String>
implements PscDefectRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.psc-002}")
private String tableName;
public PscDefectRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "new_snp.psc_detail";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -33,7 +46,7 @@ public class PscDefectRepositoryImpl extends BaseJdbcRepository<PscDefectEntity,
@Override
protected String getEntityName() {
return "PscInspection";
return "PscDefect";
}
@Override
@@ -44,32 +57,32 @@ public class PscDefectRepositoryImpl extends BaseJdbcRepository<PscDefectEntity,
@Override
public String getInsertSql() {
return """
INSERT INTO t_snp_data.psc_defect(
INSERT INTO %s(""".formatted(getTableName()) + """
defect_id,
inspection_id,
data_set_version,
action_1,
action_2,
action_3,
action_code_1,
action_code_2,
action_code_3,
class_is_responsible,
defect_code,
defect_text,
defective_item_code,
detention_reason_deficiency,
main_defect_code,
main_defect_text,
nature_of_defect_code,
nature_of_defect_decode,
other_action,
other_recognised_org_resp,
recognised_org_resp,
recognised_org_resp_code,
recognised_org_resp_yn,
is_accidental_damage,
job_execution_id, created_by
dataset_ver,
actn_one,
actn_two,
actn_thr,
actn_cd_one,
actn_cd_two,
actn_cd_thr,
clfic_respsb_yn,
defect_cd,
defect_cn,
defect_iem_cd,
detained_reason_defect,
main_defect_cd,
main_defect_cn,
defect_type_cd,
defect_type_nm,
etc_actn,
etc_pubc_engines_respsb,
pubc_engines_respsb,
pubc_engines_respsb_cd,
pubc_engines_respsb_yn,
acdnt_damg_yn,
job_execution_id, creatr_id
) VALUES (
?,?,?,?,?,?,?,?,?,?,
?,?,?,?,?,?,?,?,?,?,

View file

@@ -3,6 +3,7 @@ package com.snp.batch.jobs.pscInspection.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.pscInspection.batch.entity.PscInspectionEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -16,13 +17,25 @@ import java.util.List;
@Repository
public class PscInspectionRepositoryImpl extends BaseJdbcRepository<PscInspectionEntity, String>
implements PscInspectionRepository{
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.psc-001}")
private String tableName;
public PscInspectionRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "new_snp.psc_detail";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
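The getTargetSchema()/getSimpleTableName() split above suggests that BaseJdbcRepository composes the qualified table name itself from the two injected values. A minimal sketch of that assumed contract; the default getTableName() body is a guess, not the project's actual code:

```java
// Assumption: BaseJdbcRepository joins schema and table roughly like this.
abstract class BaseJdbcRepositorySketch {
    protected abstract String getTargetSchema();    // e.g. @Value("${app.batch.target-schema.name}")
    protected abstract String getSimpleTableName(); // e.g. @Value("${app.batch.target-schema.tables.psc-001}")

    // Presumed default: "<schema>.<table>", consumed by the SQL templates.
    protected String getTableName() {
        return getTargetSchema() + "." + getSimpleTableName();
    }
}

public class TableNameDemo extends BaseJdbcRepositorySketch {
    @Override protected String getTargetSchema() { return "t_snp_data"; }
    @Override protected String getSimpleTableName() { return "psc_detail"; }

    public static void main(String[] args) {
        // The INSERT templates then expand via "INSERT INTO %s(...)".formatted(getTableName())
        System.out.println(new TableNameDemo().getTableName()); // t_snp_data.psc_detail
    }
}
```

With this split, only the two @Value properties change between environments; the SQL templates stay untouched.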
@Override
@@ -38,38 +51,38 @@ public class PscInspectionRepositoryImpl extends BaseJdbcRepository<PscInspectio
@Override
public String getInsertSql() {
return """
INSERT INTO t_snp_data.psc_detail(
INSERT INTO %s(""".formatted(getTableName()) + """
inspection_id,
data_set_version,
authorisation,
call_sign,
class,
charterer,
dataset_ver,
aprv_type,
clsgn_no,
clfic,
chrter,
country,
inspection_date,
release_date,
ship_detained,
dead_weight,
expanded_inspection,
flag,
follow_up_inspection,
gross_tonnage,
inspection_port_decode,
last_updated,
ihslr_or_imo_ship_no,
manager,
number_of_days_detained,
number_of_defects,
number_of_part_days_detained,
other_inspection_type,
owner,
ship_name,
ship_type_code,
ship_type_decode,
source,
unlocode,
year_of_build,
job_execution_id, created_by
inspection_ymd,
tkoff_prmt_ymd,
ship_detained_yn,
dwt,
expnd_inspection_yn,
flg,
folw_inspection_yn,
total_ton,
inspection_port_nm,
last_mdfcn_dt,
imo_no,
ship_mngr,
detained_days,
defect_cnt,
defect_cnt_days,
etc_inspection_type,
shponr,
ship_nm,
ship_type_cd,
ship_type_nm,
data_src,
un_port_cd,
build_yy,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,

@@ -3,7 +3,6 @@ package com.snp.batch.jobs.pscInspection.batch.writer;
import com.snp.batch.common.batch.writer.BaseWriter;
import com.snp.batch.jobs.pscInspection.batch.entity.PscInspectionEntity;
import com.snp.batch.jobs.pscInspection.batch.repository.PscAllCertificateRepository;
import com.snp.batch.jobs.pscInspection.batch.repository.PscCertificateRepository;
import com.snp.batch.jobs.pscInspection.batch.repository.PscDefectRepository;
import com.snp.batch.jobs.pscInspection.batch.repository.PscInspectionRepository;
import lombok.extern.slf4j.Slf4j;
@@ -16,17 +15,14 @@ import java.util.List;
public class PscInspectionWriter extends BaseWriter<PscInspectionEntity> {
private final PscInspectionRepository pscInspectionRepository;
private final PscDefectRepository pscDefectRepository;
private final PscCertificateRepository pscCertificateRepository;
private final PscAllCertificateRepository pscAllCertificateRepository;
public PscInspectionWriter(PscInspectionRepository pscInspectionRepository,
PscDefectRepository pscDefectRepository,
PscCertificateRepository pscCertificateRepository,
PscAllCertificateRepository pscAllCertificateRepository) {
super("PscInspection");
this.pscInspectionRepository = pscInspectionRepository;
this.pscDefectRepository = pscDefectRepository;
this.pscCertificateRepository = pscCertificateRepository;
this.pscAllCertificateRepository = pscAllCertificateRepository;
}
@@ -40,16 +36,14 @@ public class PscInspectionWriter extends BaseWriter<PscInspectionEntity> {
for (PscInspectionEntity entity : items) {
pscInspectionRepository.saveAll(List.of(entity));
pscDefectRepository.saveDefects(entity.getDefects());
pscCertificateRepository.saveCertificates(entity.getCertificates());
pscAllCertificateRepository.saveAllCertificates(entity.getAllCertificates());
// log the counts compactly
int defectCount = entity.getDefects() != null ? entity.getDefects().size() : 0;
int certificateCount = entity.getCertificates() != null ? entity.getCertificates().size() : 0;
int allCertificateCount = entity.getAllCertificates() != null ? entity.getAllCertificates().size() : 0;
log.info("Inspection ID: {}, Defects: {}, Certificates: {}, AllCertificates: {}",
entity.getInspectionId(), defectCount, certificateCount, allCertificateCount);
log.info("Inspection ID: {}, Defects: {}, AllCertificates: {}",
entity.getInspectionId(), defectCount, allCertificateCount);
}
}
}

@@ -14,6 +14,7 @@ import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
@@ -30,6 +31,9 @@ public class RiskImportJobConfig extends BaseJobConfig<RiskDto, RiskEntity> {
private final RiskDataWriter riskDataWriter;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Override
protected int getChunkSize() {
return 5000; // the API returns 5000 records per call, so the chunk size matches
@@ -60,7 +64,7 @@ public class RiskImportJobConfig extends BaseJobConfig<RiskDto, RiskEntity> {
@Override
protected ItemReader<RiskDto> createReader() {
return new RiskDataReader(maritimeServiceApiWebClient, jdbcTemplate);
return new RiskDataReader(maritimeServiceApiWebClient, jdbcTemplate, targetSchema);
}
@Override

@@ -42,9 +42,12 @@ public class RiskImportRangeJobConfig extends BaseMultiStepJobConfig<RiskDto, Ri
@Value("${app.batch.webservice-api.url}")
private String maritimeServiceApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "RISK_IMPORT_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
@Override

@@ -19,14 +19,16 @@ public class RiskDataReader extends BaseApiReader<RiskDto> {
// 3. Response data -> update Core20 (repeated chunk by chunk)
private final JdbcTemplate jdbcTemplate;
private final String targetSchema;
private List<String> allImoNumbers;
private int currentBatchIndex = 0;
private final int batchSize = 100;
public RiskDataReader(WebClient webClient, JdbcTemplate jdbcTemplate) {
public RiskDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, String targetSchema) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
this.targetSchema = targetSchema;
enableChunkMode(); // enable chunk mode
}
@@ -46,20 +48,18 @@ public class RiskDataReader extends BaseApiReader<RiskDto> {
return "/RiskAndCompliance/RisksByImos";
}
// private String getTargetTable(){
// return "snp_data.core20";
// }
private String getTargetTable(){
return "snp_data.ship_data";
return targetSchema + ".ship_data";
}
private String getImoQuery() {
return "select imo_number as ihslrorimoshipno from " + getTargetTable() + " order by imo_number";
}
private String GET_CORE_IMO_LIST =
// "SELECT ihslrorimoshipno FROM " + getTargetTable() + " ORDER BY ihslrorimoshipno";
"select imo_number as ihslrorimoshipno from snp_data.ship_data order by imo_number";
@Override
protected void beforeFetch(){
log.info("[{}] Core20 테이블에서 IMO 번호 조회 시작...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_CORE_IMO_LIST, String.class);
allImoNumbers = jdbcTemplate.queryForList(getImoQuery(), String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
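The reader's chunking above (ceil division of the IMO list into fixed-size batches) can be sketched in isolation; the class and method names below are illustrative, not part of the project:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the fixed-size batching the readers above perform on the IMO list.
public class ImoBatcher {
    static List<List<String>> partition(List<String> imos, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < imos.size(); i += batchSize) {
            // subList's upper bound is clamped so the last batch may be short
            batches.add(imos.subList(i, Math.min(i + batchSize, imos.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> imos = List.of("9000001", "9000002", "9000003", "9000004", "9000005");
        int batchSize = 2;
        // same ceil-division formula as in beforeFetch()
        int totalBatches = (int) Math.ceil((double) imos.size() / batchSize);
        System.out.println(totalBatches);                      // 3
        System.out.println(partition(imos, batchSize).size()); // 3
    }
}
```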

@@ -3,6 +3,7 @@ package com.snp.batch.jobs.risk.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.risk.batch.entity.RiskEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -15,13 +16,24 @@ import java.util.List;
@Repository("riskRepository")
public class RiskRepositoryImpl extends BaseJdbcRepository<RiskEntity, Long> implements RiskRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.risk-compliance-001}")
private String tableName;
public RiskRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "t_snp_data.risk";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -43,26 +55,26 @@ public class RiskRepositoryImpl extends BaseJdbcRepository<RiskEntity, Long> imp
protected String getUpdateSql() {
return """
INSERT INTO %s(
lrno, lastupdated,
riskdatamaintained, dayssincelastseenonais, daysunderais, imocorrectonais, sailingundername,
anomalousmessagesfrommmsi, mostrecentdarkactivity, portcalls, portrisk, stsoperations,
driftinghighseas, riskevents, flagchanges, flagparismouperformance, flagtokyomoupeformance,
flaguscgmouperformance, uscgqualship21, timesincepscinspection, pscinspections, pscdefects,
pscdetentions, currentsmccertificate, docchanges, currentclass, classstatuschanges,
pandicoverage, namechanges, gbochanges, ageofship, iuufishingviolation,
draughtchanges, mostrecentsanctionedportcall, singleshipoperation, fleetsafety, fleetpsc,
specialsurveyoverdue, ownerunknown, russianportcall, russianownerregistration, russiansts,
job_execution_id, created_by
imo_no, last_mdfcn_dt,
risk_data_maint, ais_notrcv_elps_days, ais_lwrnk_days, ais_up_imo_desc, othr_ship_nm_voy_yn,
mmsi_anom_message, recent_dark_actv, port_prtcll, port_risk, sts_job,
drift_chg, risk_event, ntnlty_chg, ntnlty_prs_mou_perf, ntnlty_tky_mou_perf,
ntnlty_uscg_mou_perf, uscg_excl_ship_cert, psc_inspection_elps_hr, psc_inspection, psc_defect,
psc_detained, now_smgrc_evdc, docc_chg, now_clfic, clfic_status_chg,
pni_insrnc, ship_nm_chg, gbo_chg, vslage, ilgl_fshr_viol,
draft_chg, recent_sanction_prtcll, sngl_ship_voy, fltsfty, flt_psc,
spc_inspection_ovdue, ownr_unk, rss_port_call, rss_ownr_reg, rss_sts,
job_execution_id, creatr_id
)
VALUES (
?, ?::timestamptz,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?::timestamptz,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);

@@ -15,6 +15,7 @@ import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
@@ -51,7 +52,10 @@ public class ShipDetailImportJobConfig extends BaseJobConfig<ShipDetailDto, Ship
private final ShipDetailDataWriter shipDetailDataWriter;
private final JdbcTemplate jdbcTemplate;
private final WebClient maritimeApiWebClient;
private final ObjectMapper objectMapper; // ObjectMapper 주입 추가
private final ObjectMapper objectMapper;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
public ShipDetailImportJobConfig(
JobRepository jobRepository,
@@ -60,13 +64,13 @@ public class ShipDetailImportJobConfig extends BaseJobConfig<ShipDetailDto, Ship
ShipDetailDataWriter shipDetailDataWriter,
JdbcTemplate jdbcTemplate,
@Qualifier("maritimeApiWebClient") WebClient maritimeApiWebClient,
ObjectMapper objectMapper) { // ObjectMapper 주입 추가
ObjectMapper objectMapper) {
super(jobRepository, transactionManager);
this.shipDetailDataProcessor = shipDetailDataProcessor;
this.shipDetailDataWriter = shipDetailDataWriter;
this.jdbcTemplate = jdbcTemplate;
this.maritimeApiWebClient = maritimeApiWebClient;
this.objectMapper = objectMapper; // ObjectMapper 초기화
this.objectMapper = objectMapper;
}
@Override
@@ -80,8 +84,8 @@ public class ShipDetailImportJobConfig extends BaseJobConfig<ShipDetailDto, Ship
}
@Override
protected ItemReader<ShipDetailDto> createReader() { // 타입 변경
return new ShipDetailDataReader(maritimeApiWebClient, jdbcTemplate, objectMapper);
protected ItemReader<ShipDetailDto> createReader() {
return new ShipDetailDataReader(maritimeApiWebClient, jdbcTemplate, objectMapper, targetSchema);
}
@Override

@@ -13,6 +13,8 @@ import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.beans.factory.annotation.Value;
import java.util.Arrays;
import java.util.List;
@@ -23,6 +25,9 @@ public class ShipDetailSyncJobConfig {
private final PlatformTransactionManager transactionManager;
private final JdbcTemplate jdbcTemplate;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
// API key definition (for batch execution logging)
protected String getApiKey() {
return "SHIP_DETAIL_SYNC_API";
@@ -31,8 +36,8 @@ public class ShipDetailSyncJobConfig {
// SQL to update the last successful execution date
protected String getBatchUpdateSql() {
return String.format(
"UPDATE SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'",
getApiKey()
"UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'",
targetSchema, getApiKey()
);
}
@@ -80,7 +85,8 @@ public class ShipDetailSyncJobConfig {
log.info(">>>>> SHIP MASTER & CORE20 동기화 프로시저 호출 시작");
// PostgreSQL-style procedure invocation (CALL)
jdbcTemplate.execute("CALL snp_data.proc_sync_ship_master_and_core()");
String procedureCall = String.format("CALL %s.proc_sync_ship_master_and_core()", targetSchema);
jdbcTemplate.execute(procedureCall);
log.info(">>>>> SHIP MASTER & CORE20 동기화 프로시저 호출 완료");
return RepeatStatus.FINISHED;
@@ -106,7 +112,8 @@ public class ShipDetailSyncJobConfig {
try {
log.info("테이블 동기화 중: {}", tableName);
// call the dynamic procedure created earlier
jdbcTemplate.execute("CALL snp_data.proc_sync_ship_detail('" + tableName + "')");
String procedureCall = String.format("CALL %s.proc_sync_ship_detail('%s')", targetSchema, tableName);
jdbcTemplate.execute(procedureCall);
} catch (Exception e) {
log.error("테이블 동기화 실패: {}. 에러: {}", tableName, e.getMessage());
// decide whether to throw (abort the job) or continue when a single table fails
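A minimal stand-in for the dynamic CALL construction above. It assumes, as the job config does, that schema and table names come from trusted application.yml values, since SQL identifiers cannot be bound as JDBC parameters:

```java
public class ProcCallDemo {
    // Mirrors the String.format pattern above. Assumption: schema/table are
    // trusted config values, never user input, because they are spliced into
    // the statement text rather than bound as parameters.
    static String procCall(String schema, String table) {
        return String.format("CALL %s.proc_sync_ship_detail('%s')", schema, table);
    }

    public static void main(String[] args) {
        System.out.println(procCall("snp_data", "namehistory"));
        // CALL snp_data.proc_sync_ship_detail('namehistory')
    }
}
```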

@@ -44,9 +44,13 @@ public class ShipDetailUpdateJobConfig extends BaseMultiStepJobConfig<ShipDetail
@Value("${app.batch.ship-api.url}")
private String maritimeApiUrl;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
protected String getApiKey() {return "SHIP_DETAIL_UPDATE_API";}
protected String getBatchUpdateSql() {
return String.format("UPDATE T_SNP_DATA.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", getApiKey());}
return String.format("UPDATE %s.BATCH_LAST_EXECUTION SET LAST_SUCCESS_DATE = NOW(), UPDATED_AT = NOW() WHERE API_KEY = '%s'", targetSchema, getApiKey());}
public ShipDetailUpdateJobConfig(

@@ -14,11 +14,13 @@ import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.web.reactive.function.client.WebClient;
@Slf4j
@Configuration
public class ShipLastPositionUpdateJobConfig extends BaseJobConfig<TargetEnhancedDto, TargetEnhancedEntity> {
@@ -29,6 +31,9 @@ public class ShipLastPositionUpdateJobConfig extends BaseJobConfig<TargetEnhance
private final ShipLastPositionDataWriter shipLastPositionDataWriter;
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Override
protected int getChunkSize() {
return 5000; // the API returns 5000 records per call, so the chunk size matches
@@ -59,7 +64,7 @@ public class ShipLastPositionUpdateJobConfig extends BaseJobConfig<TargetEnhance
@Override
protected ItemReader<TargetEnhancedDto> createReader() {
return new ShipLastPositionDataReader(maritimeAisApiWebClient, jdbcTemplate);
return new ShipLastPositionDataReader(maritimeAisApiWebClient, jdbcTemplate, targetSchema);
}
@Override

@@ -39,16 +39,18 @@ public class ShipDetailDataReader extends BaseApiReader<ShipDetailDto> {
private final JdbcTemplate jdbcTemplate;
private final ObjectMapper objectMapper;
private final String targetSchema;
// batch processing state
private List<String> allImoNumbers;
private int currentBatchIndex = 0;
private final int batchSize = 30;
public ShipDetailDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, ObjectMapper objectMapper) {
public ShipDetailDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, ObjectMapper objectMapper, String targetSchema) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
this.objectMapper = objectMapper;
this.targetSchema = targetSchema;
enableChunkMode(); // enable chunk mode
}
@@ -68,8 +70,12 @@ public class ShipDetailDataReader extends BaseApiReader<ShipDetailDto> {
return "/MaritimeWCF/APSShipService.svc/RESTFul/GetShipsByIHSLRorIMONumbersAll";
}
private static final String GET_ALL_IMO_QUERY =
"select imo_number from t_snp_data.ship_data order by imo_number";
/**
* Builds the IMO lookup query (schema applied dynamically)
*/
private String getImoQuery() {
return "select imo_no from " + targetSchema + ".tb_ship_default_info order by imo_no";
}
/**
@@ -80,7 +86,7 @@ public class ShipDetailDataReader extends BaseApiReader<ShipDetailDto> {
// Step 1. fetch all IMO numbers
log.info("[{}] ship_data 테이블에서 IMO 번호 조회 시작...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_ALL_IMO_QUERY, String.class);
allImoNumbers = jdbcTemplate.queryForList(getImoQuery(), String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);
log.info("[{}] 총 {} 개의 IMO 번호 조회 완료", getReaderName(), allImoNumbers.size());

@@ -17,14 +17,16 @@ public class ShipLastPositionDataReader extends BaseApiReader<TargetEnhancedDto>
// 3. Response data -> update Core20 (repeated chunk by chunk)
private final JdbcTemplate jdbcTemplate;
private final String targetSchema;
private List<String> allImoNumbers;
private int currentBatchIndex = 0;
private final int batchSize = 5000;
public ShipLastPositionDataReader(WebClient webClient, JdbcTemplate jdbcTemplate) {
public ShipLastPositionDataReader(WebClient webClient, JdbcTemplate jdbcTemplate, String targetSchema) {
super(webClient);
this.jdbcTemplate = jdbcTemplate;
this.targetSchema = targetSchema;
enableChunkMode(); // enable chunk mode
}
@@ -45,16 +47,18 @@ public class ShipLastPositionDataReader extends BaseApiReader<TargetEnhancedDto>
}
private String getTargetTable(){
return "new_snp.core20";
return targetSchema + ".core20";
}
private String getImoQuery() {
return "SELECT lrno FROM " + getTargetTable() + " ORDER BY lrno";
}
private String GET_CORE_IMO_LIST =
"SELECT lrno FROM " + getTargetTable() + " ORDER BY lrno";
@Override
protected void beforeFetch(){
log.info("[{}] Core20 테이블에서 IMO 번호 조회 시작...", getReaderName());
allImoNumbers = jdbcTemplate.queryForList(GET_CORE_IMO_LIST, String.class);
allImoNumbers = jdbcTemplate.queryForList(getImoQuery(), String.class);
int totalBatches = (int) Math.ceil((double) allImoNumbers.size() / batchSize);

@@ -3,6 +3,7 @@ package com.snp.batch.jobs.shipdetail.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.shipdetail.batch.entity.*;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -21,13 +22,24 @@ import java.util.*;
public class ShipDetailRepositoryImpl extends BaseJdbcRepository<ShipDetailEntity, String>
implements ShipDetailRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.ship-002}")
private String tableName;
public ShipDetailRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "t_snp_data.ship_detail_data";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -44,28 +56,28 @@ public class ShipDetailRepositoryImpl extends BaseJdbcRepository<ShipDetailEntit
protected String getInsertSql() {
return """
INSERT INTO %s(
ihslrorimoshipno, maritimemobileserviceidentitymmsinumber, shipname,
callsign, flagname, portofregistry, classificationsociety, shiptypelevel5,
shiptypelevel5subtype, yearofbuild, shipbuilder, lengthoverallloa, breadthmoulded,
"depth", draught, grosstonnage, deadweight, teu,
mainenginetype, shipstatus, operator, flagcode, shiptypelevel2,
officialnumber, fishingnumber, classnarrative,
alterationsdescriptivenarrative, shiptypegroup, shiptypelevel3, shiptypelevel4, shiptypelevel5hulltype,
shiptypelevel5subgroup, constructiondescriptivenarrative, dateofbuild,
shipbuilderfullstyle, yardnumber, consumptionspeed1, consumptionvalue1, consumptionspeed2,
consumptionvalue2, totalbunkercapacity, boilermanufacturer, propellermanufacturer,
lengthregistered, breadthextreme, keeltomastheight, displacement, lengthbetweenperpendicularslbp,
bulbousbow, tonnespercentimetreimmersiontpci, tonnageeffectivedate, formuladwt, nettonnage,
compensatedgrosstonnagecgt, lightdisplacementtonnage, graincapacity, balecapacity, liquidcapacity,
gascapacity, teucapacity14thomogenous, insulatedcapacity, passengercapacity, bollardpull,
cargocapacitiesnarrative, geardescriptivenarrative, holdsdescriptivenarrative, hatchesdescriptivenarrative,
lanesdoorsrampsnarrative, specialisttankernarrative, tanksdescriptivenarrative,
primemoverdescriptivenarrative, primemoverdescriptiveoverviewnarrative,
auxiliaryenginesnarrative, auxiliarygeneratorsdescriptivenarrative, bunkersdescriptivenarrative,
lastupdatedate,
documentofcompliancedoccompanycode, groupbeneficialownercompanycode, operatorcompanycode, shipmanagercompanycode, technicalmanagercode, registeredownercode,
datasetversion, speedservice,
job_execution_id, created_by
imo_no, mmsi_no, ship_nm,
clsgn_no, ship_ntnlty, load_port, clfic, ship_type_lv_five,
ship_type_lv_five_dtld_type, build_yy, shpyrd, whlnth_loa, formn_breadth,
depth, draft, gt, dwt, teu_cnt,
main_engine_type, ship_status, operator, ntnlty_cd, ship_type_lv_two,
frmla_reg_no, fshr_prmt_no, clfic_desc,
modf_hstry_desc, ship_type_group, ship_type_lv_thr, ship_type_lv_four, ship_type_lv_five_hull_type,
ship_type_lv_five_lwrnk_group, build_desc, build_ymd,
shpyrd_offcl_nm, shpyrd_build_no, fuel_cnsmp_spd_one, fuel_cnsmpamt_val_one, fuel_cnsmp_spd_two,
fuel_cnsmpamt_val_two, total_fuel_capacity_m3, blr_mftr, proplr_mftr,
reg_length, max_breadth, keel_mast_hg, displacement, lbp,
bulb_bow, fldng_one_cm_per_ton_tpci, ton_efect_day, calcfrm_dwt, nt_ton,
cgt, light_displacement_ton, grain_capacity_m3, bale_capacity, liquid_capacity,
gas_m3, teu_capacity, insulated_m3, passenger_capacity, bollard_pull,
cargo_capacity_m3_desc, eqpmnt_desc, hdn, hatche_desc,
lane_door_ramp_desc, spc_tank_desc, tank_desc,
prmovr_desc, prmovr_ovrvw_desc,
aux_desc, asst_gnrtr_desc, fuel_desc,
last_mdfcn_dt,
doc_company_cd, group_actl_ownr_company_cd, operator_company_cd, ship_mngr_company_cd, tech_mngr_cd, reg_shponr_cd,
dataset_ver, svc_spd,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
@@ -923,7 +935,7 @@ public class ShipDetailRepositoryImpl extends BaseJdbcRepository<ShipDetailEntit
}
public boolean existsByImo(String imo) {
String sql = String.format("SELECT COUNT(*) FROM %s WHERE %s = ?", getTableName(), getIdColumnName("ihslrorimoshipno"));
String sql = String.format("SELECT COUNT(*) FROM %s WHERE %s = ?", getTableName(), getIdColumnName("imo_no"));
Long count = jdbcTemplate.queryForObject(sql, Long.class, imo);
return count != null && count > 0;
}
@@ -932,7 +944,7 @@ public class ShipDetailRepositoryImpl extends BaseJdbcRepository<ShipDetailEntit
@Override
public void delete(String id) {
String sql = "DELETE FROM " + getTableName() + " WHERE ihslrorimoshipno = ?";
String sql = "DELETE FROM " + getTableName() + " WHERE imo_no = ?";
jdbcTemplate.update(sql, id);
log.debug("[{}] 삭제 완료: id={}", getEntityName(), id);
}

@@ -1,372 +1,541 @@
package com.snp.batch.jobs.shipdetail.batch.repository;
import lombok.Setter;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
/**
* SQL builder class for ShipDetail-related tables
* Uses the app.batch.target-schema.name value from application.yml
*/
@Component
public class ShipDetailSql {
public static final String TARGET_SCHEMA = "t_snp_data";
private static String targetSchema;
private static String ownerhistoryTable;
private static String crewlistTable;
private static String stowagecommodityTable;
private static String groupbeneficialownerhistoryTable;
private static String shipmanagerhistoryTable;
private static String operatorhistoryTable;
private static String technicalmanagerhistoryTable;
private static String bareboatcharterhistoryTable;
private static String namehistoryTable;
private static String flaghistoryTable;
private static String additionalshipsdataTable;
private static String pandihistoryTable;
private static String callsignandmmsihistoryTable;
private static String iceclassTable;
private static String safetymanagementcertificatehistTable;
private static String classhistoryTable;
private static String surveydatesTable;
private static String surveydateshistoryuniqueTable;
private static String sistershiplinksTable;
private static String statushistoryTable;
private static String specialfeatureTable;
private static String thrustersTable;
private static String companyvesselrelationshipsTable;
private static String darkactivityconfirmedTable;
private static String companyDetailTable;
@Value("${app.batch.target-schema.name}")
public void setTargetSchema(String schema) {
ShipDetailSql.targetSchema = schema;
}
@Value("${app.batch.target-schema.tables.ship-015}")
public void setOwnerhistoryTable(String table) {
ShipDetailSql.ownerhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-008}")
public void setCrewlistTable(String table) {
ShipDetailSql.crewlistTable = table;
}
@Value("${app.batch.target-schema.tables.ship-022}")
public void setStowagecommodityTable(String table) {
ShipDetailSql.stowagecommodityTable = table;
}
@Value("${app.batch.target-schema.tables.ship-011}")
public void setGroupbeneficialownerhistoryTable(String table) {
ShipDetailSql.groupbeneficialownerhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-018}")
public void setShipmanagerhistoryTable(String table) {
ShipDetailSql.shipmanagerhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-014}")
public void setOperatorhistoryTable(String table) {
ShipDetailSql.operatorhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-025}")
public void setTechnicalmanagerhistoryTable(String table) {
ShipDetailSql.technicalmanagerhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-004}")
public void setBareboatcharterhistoryTable(String table) {
ShipDetailSql.bareboatcharterhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-013}")
public void setNamehistoryTable(String table) {
ShipDetailSql.namehistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-010}")
public void setFlaghistoryTable(String table) {
ShipDetailSql.flaghistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-003}")
public void setAdditionalshipsdataTable(String table) {
ShipDetailSql.additionalshipsdataTable = table;
}
@Value("${app.batch.target-schema.tables.ship-016}")
public void setPandihistoryTable(String table) {
ShipDetailSql.pandihistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-005}")
public void setCallsignandmmsihistoryTable(String table) {
ShipDetailSql.callsignandmmsihistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-012}")
public void setIceclassTable(String table) {
ShipDetailSql.iceclassTable = table;
}
@Value("${app.batch.target-schema.tables.ship-017}")
public void setSafetymanagementcertificatehistTable(String table) {
ShipDetailSql.safetymanagementcertificatehistTable = table;
}
@Value("${app.batch.target-schema.tables.ship-006}")
public void setClasshistoryTable(String table) {
ShipDetailSql.classhistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-023}")
public void setSurveydatesTable(String table) {
ShipDetailSql.surveydatesTable = table;
}
@Value("${app.batch.target-schema.tables.ship-024}")
public void setSurveydateshistoryuniqueTable(String table) {
ShipDetailSql.surveydateshistoryuniqueTable = table;
}
@Value("${app.batch.target-schema.tables.ship-019}")
public void setSistershiplinksTable(String table) {
ShipDetailSql.sistershiplinksTable = table;
}
@Value("${app.batch.target-schema.tables.ship-021}")
public void setStatushistoryTable(String table) {
ShipDetailSql.statushistoryTable = table;
}
@Value("${app.batch.target-schema.tables.ship-020}")
public void setSpecialfeatureTable(String table) {
ShipDetailSql.specialfeatureTable = table;
}
@Value("${app.batch.target-schema.tables.ship-026}")
public void setThrustersTable(String table) {
ShipDetailSql.thrustersTable = table;
}
@Value("${app.batch.target-schema.tables.ship-007}")
public void setCompanyvesselrelationshipsTable(String table) {
ShipDetailSql.companyvesselrelationshipsTable = table;
}
@Value("${app.batch.target-schema.tables.ship-009}")
public void setDarkactivityconfirmedTable(String table) {
ShipDetailSql.darkactivityconfirmedTable = table;
}
@Value("${app.batch.target-schema.tables.company-001}")
public void setCompanyDetailTable(String table) {
ShipDetailSql.companyDetailTable = table;
}
public static String getTargetSchema() {
return targetSchema;
}
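The @Value setters above write into static fields: Spring cannot inject static fields directly, so the class registers as a @Component and lets instance setters populate the statics once at startup. A plain-Java stand-in of that pattern (the injection is simulated by an explicit setter call; names are illustrative):

```java
// Stand-in for the instance-setter-to-static-field pattern used by ShipDetailSql.
// Assumption: in the real class the setter carries
// @Value("${app.batch.target-schema.name}") and Spring invokes it on the bean.
public class StaticConfigHolder {
    private static String targetSchema;

    // Instance setter writing a static field, so static SQL builders can read it.
    public void setTargetSchema(String schema) {
        StaticConfigHolder.targetSchema = schema;
    }

    public static String insertSql(String table) {
        // Same %s.%s composition as the getOwnerHistorySql()-style methods above.
        return "INSERT INTO %s.%s(...) VALUES (...)".formatted(targetSchema, table);
    }

    public static void main(String[] args) {
        new StaticConfigHolder().setTargetSchema("t_snp_data"); // simulates Spring injection
        System.out.println(insertSql("ownerhistory"));
    }
}
```

One caveat of this design: any code calling the static getters before the Spring context has initialized the bean sees null, so the SQL builders must only run after startup.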
public static String getOwnerHistorySql(){
return """
INSERT INTO %s.ownerhistory(
datasetversion, companystatus, effectivedate, lrno, "owner",
ownercode, "sequence",
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, company_status, efect_sta_day, imo_no, ownr,
ownr_cd, ship_ownr_hstry_seq,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, ownerhistoryTable);
}
public static String getCrewListSql(){
return """
INSERT INTO %s.crewlist(
datasetversion, id, lrno, shipname, crewlistdate,
nationality, totalcrew, totalratings, totalofficers, totalcadets,
totaltrainees, totalridingsquad, totalundeclared,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, crew_id, imo_no, ship_nm, crew_rstr_ymd,
ntnlty, oa_crew_cnt, gen_crew_cnt, offcr_cnt, appr_offcr_cnt,
trne_cnt, embrk_mntnc_crew_cnt, unrprt_cnt,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, crewlistTable);
}
public static String getStowageCommoditySql(){
return """
INSERT INTO %s.stowagecommodity(
datasetversion, commoditycode, commoditydecode, lrno, "sequence",
stowagecode, stowagedecode,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, cargo_cd, cargo_nm, imo_no, ship_cargo_capacity_seq,
capacity_cd, capacity_cd_desc,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, stowagecommodityTable);
}
public static String getGroupBeneficialOwnerHistorySql(){
return """
INSERT INTO %s.groupbeneficialownerhistory(
datasetversion, companystatus, effectivedate, groupbeneficialowner, groupbeneficialownercode,
lrno, "sequence",
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, company_status, efect_sta_day, group_actl_ownr, group_actl_ownr_cd,
imo_no, ship_group_revn_ownr_hstry_seq,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, groupbeneficialownerhistoryTable);
}
public static String getShipManagerHistorySql(){
return """
INSERT INTO %s.shipmanagerhistory(
datasetversion, companystatus, effectivedate, lrno, "sequence",
shipmanager, shipmanagercode,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, company_status, efect_sta_day, imo_no, ship_mng_company_seq,
ship_mngr, ship_mngr_cd,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, shipmanagerhistoryTable);
}
public static String getOperatorHistorySql(){
return """
INSERT INTO %s.operatorhistory(
datasetversion, companystatus, effectivedate, lrno, "operator",
operatorcode, "sequence",
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, company_status, efect_sta_day, imo_no, ship_operator,
ship_operator_cd, ship_operator_hstry_seq,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, operatorhistoryTable);
}
public static String getTechnicalManagerHistorySql(){
return """
INSERT INTO %s.technicalmanagerhistory(
datasetversion, companystatus, effectivedate, lrno, "sequence",
technicalmanager, technicalmanagercode,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, company_status, efect_sta_day, imo_no, ship_tech_mng_company_seq,
tech_mngr, tech_mngr_cd,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, technicalmanagerhistoryTable);
}
public static String getBareBoatCharterHistorySql(){
return """
INSERT INTO %s.bareboatcharterhistory(
datasetversion, lrno, "sequence", effectivedate, bbcharterercode,
bbcharterer,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, bbctr_seq, efect_sta_day, bbctr_company_cd,
bbctr_company,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?,
?, ?, ?, ?, ?,
?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, bareboatcharterhistoryTable);
}
public static String getNameHistorySql(){
return """
INSERT INTO %s.namehistory(
datasetversion, effectivedate, lrno, "sequence", vesselname,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, efect_sta_day, imo_no, ship_nm_chg_hstry_seq, ship_nm,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, namehistoryTable);
}
public static String getFlagHistorySql(){
return """
INSERT INTO %s.flaghistory(
datasetversion, effectivedate, flag, flagcode, lrno,
"sequence",
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, efect_sta_day, country, country_cd, imo_no,
ship_country_hstry_seq,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?,
?, ?, ?, ?, ?,
?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, flaghistoryTable);
}
public static String getAdditionalInformationSql(){
return """
INSERT INTO %s.additionalshipsdata(
lrno, shipemail, waterdepthmax, drilldepthmax, drillbargeind,
productionvesselind, deckheatexchangerind, deckheatexchangermaterial, tweendeckportable, tweendeckfixed,
satcomid, satcomansback, datasetversion,
job_execution_id, created_by
INSERT INTO %s.%s(
imo_no, ship_eml, max_dpwt, max_drill_depth, drill_brg,
ocean_prod_facility, deck_heat_exch, dehtex_matral, portbl_twin_deck, fixed_twin_deck,
ship_satlit_comm_id, ship_satlit_cmrsp_cd, dataset_ver,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, additionalshipsdataTable);
}
public static String getPandIHistorySql(){
return """
INSERT INTO %s.pandihistory(
datasetversion, lrno, "sequence", pandiclubcode, pandiclubdecode,
effectivedate, "source",
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, ship_prtc_rpn_hstry_seq, pni_club_cd, pni_club_nm,
efect_sta_day, src,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, pandihistoryTable);
}
public static String getCallSignAndMmsiHistorySql(){
return """
INSERT INTO %s.callsignandmmsihistory(
datasetversion, lrno, "sequence", callsign, mmsi,
effectivedate,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, ship_idntf_seq, clsgn_no, mmsi_no,
efect_sta_day,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?,
?, ?, ?, ?, ?,
?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, callsignandmmsihistoryTable);
}
public static String getIceClassSql(){
return """
INSERT INTO %s.iceclass(
datasetversion, iceclass, iceclasscode, lrno,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, ice_grd, ice_grd_cd, imo_no,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?,
?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, iceclassTable);
}
public static String getSafetyManagementCertificateHistorySql(){
return """
INSERT INTO %s.safetymanagementcertificatehist(
datasetversion, lrno, safetymanagementcertificateauditor, safetymanagementcertificateconventionorvol, safetymanagementcertificatedateexpires,
safetymanagementcertificatedateissued, safetymanagementcertificatedoccompany, safetymanagementcertificateflag, safetymanagementcertificateissuer, safetymanagementcertificateotherdescription,
safetymanagementcertificateshipname, safetymanagementcertificateshiptype, safetymanagementcertificatesource, safetymanagementcertificatecompanycode, "sequence",
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, smgrc_srng_engines, smgrc_sys_cat_conv_arbt, smgrc_expry_day,
smgrc_issue_day, smgrc_docc_company, smgrc_ntnlty, smgrc_issue_engines, smgrc_etc_desc,
smgrc_ship_nm, smgrc_ship_type, smgrc_src, smgrc_company_cd, ship_sfty_mng_evdc_seq,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, safetymanagementcertificatehistTable);
}
public static String getClassHistorySql(){
return """
INSERT INTO %s.classhistory(
datasetversion, "class", classcode, classindicator, currentindicator,
effectivedate, lrno, "sequence", classid,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, clfic_asctn_nm, clfic_cd, clfic_has_yn, now_yn,
efect_sta_day, imo_no, clfic_hstry_seq, clfic_id,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, classhistoryTable);
}
public static String getSurveyDatesHistorySql(){
return """
INSERT INTO %s.surveydates(
datasetversion, classsociety, classsocietycode, dockingsurvey, lrno,
specialsurvey, annualsurvey, continuousmachinerysurvey, tailshaftsurvey,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, clfic, clfic_cd, dckng_inspection, imo_no,
fxtm_inspection, annual_inspection, mchn_fxtm_inspection_ymd, tlsft_inspection_ymd,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, surveydatesTable);
}
public static String getSurveyDatesHistoryUniqueSql(){
return """
INSERT INTO %s.surveydateshistoryunique(
datasetversion, lrno, classsocietycode, surveydate, surveytype,
classsociety,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, clfic_cd, inspection_ymd, inspection_type,
clfic,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?,
?, ?, ?, ?, ?,
?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, surveydateshistoryuniqueTable);
}
public static String getSisterShipLinksSql(){
return """
INSERT INTO %s.sistershiplinks(
datasetversion, lrno, linkedlrno, job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, link_imo_no, job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, sistershiplinksTable);
}
public static String getStatusHistorySql(){
return """
INSERT INTO %s.statushistory(
datasetversion, lrno, "sequence", status, statuscode, statusdate,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, ship_status_hstry_seq, status, status_cd, status_chg_ymd,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, statushistoryTable);
}
public static String getSpecialFeatureSql(){
return """
INSERT INTO %s.specialfeature(
datasetversion, lrno, "sequence", specialfeature, specialfeaturecode,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, ship_spc_fetr_seq, spc_mttr, spc_mttr_cd,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, specialfeatureTable);
}
public static String getThrustersSql(){
return """
INSERT INTO %s.thrusters(
datasetversion, lrno, "sequence", thrustertype, thrustertypecode,
numberofthrusters, thrusterposition, thrusterbhp, thrusterkw, typeofinstallation,
job_execution_id, created_by
INSERT INTO %s.%s(
dataset_ver, imo_no, thrstr_seq, thrstr_type, thrstr_type_cd,
thrstr_cnt, thrstr_position, thrstr_power_bhp, thrstr_power_kw, instl_mth,
job_execution_id, creatr_id
)VALUES(
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, thrustersTable);
}
public static String getCompanyVesselRelationshipSql(){
return """
INSERT INTO %s.companyvesselrelationships (
datasetversion, doccode, doccompany, groupbeneficialowner, groupbeneficialownercode,
lrno, "operator", operatorcode, registeredowner, registeredownercode,
shipmanager, shipmanagercode, technicalmanager, technicalmanagercode, docgroup,
docgroupcode, operatorgroup, operatorgroupcode, shipmanagergroup, shipmanagergroupcode,
technicalmanagergroup, technicalmanagergroupcode,
job_execution_id, created_by
INSERT INTO %s.%s (
dataset_ver, docc_has_company_cd, docc_has_company, group_actl_ownr, group_actl_ownr_cd,
imo_no, ship_operator, ship_operator_cd, rg_ownr, rg_ownr_cd,
ship_mng_company, ship_mng_company_cd, tech_mng_company, tech_mng_company_cd, docc_group,
docc_group_cd, ship_operator_group, ship_operator_group_cd, ship_mng_company_group, ship_mng_company_group_cd,
tech_mng_company_group, tech_mng_company_group_cd,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, companyvesselrelationshipsTable);
}
public static String getDarkActivityConfirmedSql(){
return """
INSERT INTO %s.darkactivityconfirmed (
datasetversion, lrno, mmsi, vessel_name, dark_hours,
dark_activity, dark_status, area_id, area_name, area_country,
dark_time, dark_latitude, dark_longitude, dark_speed, dark_heading,
dark_draught, nextseen, nextseen_speed, nextseen_draught, nextseen_heading,
dark_reported_destination, last_port_of_call, last_port_country_code,last_port_country, nextseen_latitude,
nextseen_longitude, nextseen_reported_destination,
job_execution_id, created_by
INSERT INTO %s.%s (
dataset_ver, imo_no, mmsi_no, ship_nm, dark_hr,
dark_actv, dark_actv_status, zone_id, zone_nm, zone_country,
dark_tm_utc, dark_lat, dark_lon, dark_spd, dark_heading,
dark_draft, nxt_cptr_tm_utc, nxt_cptr_spd, nxt_cptr_draft, nxt_cptr_heading,
dark_rpt_dest_ais, last_prtcll_port, last_poccntry_cd, last_poccntry, nxt_cptr_lat,
nxt_cptr_lon, nxt_cptr_rpt_dest_ais,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, darkactivityconfirmedTable);
}
public static String getCompanyDetailSql() {
return """
INSERT INTO %s.tb_company_detail (
datasetversion, owcode, shortcompanyname, countryname, townname,
telephone, telex, emailaddress, website, fullname,
careofcode, roomfloorbuilding1, roomfloorbuilding2, roomfloorbuilding3, pobox,
streetnumber, street, prepostcode, postpostcode, nationalityofregistration,
nationalityofcontrol, locationcode, nationalityofregistrationcode, nationalityofcontrolcode, lastchangedate,
parentcompany, companystatus, fulladdress, facsimile, foundeddate,
job_execution_id, created_by
INSERT INTO %s.%s (
dataset_ver, company_cd, company_name_abbr, country_nm, cty_nm,
tel, tlx, eml_addr, wbst_url, company_full_name,
care_cd, dtl_addr_one, dtl_addr_two, dtl_addr_thr, po_box,
dist_no, dist_nm, mail_addr_frnt, mail_addr_rear, company_reg_country,
company_mng_country, location_code, company_reg_country_cd, company_mng_country_cd, last_upd_ymd,
prnt_company_cd, company_status, full_address, fax_no, founded_ymd,
job_execution_id, creatr_id
) VALUES (
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?, ?, ?, ?,
?, ?
);
""".formatted(TARGET_SCHEMA);
""".formatted(targetSchema, companyDetailTable);
}
}
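The generator methods above all follow one template: logical column names are replaced by standardized physical names (datasetversion → dataset_ver, lrno → imo_no, created_by → creatr_id), and the hard-coded `TARGET_SCHEMA` gives way to an injected schema plus table name spliced in via `%s.%s`. A minimal, self-contained sketch of that pattern — the class name, method name, and column subset below are illustrative, not the project's actual code:

```java
// Sketch of the template pattern used by the generator methods above:
// the target schema and physical table name are injected at runtime
// (via @Value in the real repositories) and spliced into the SQL with
// String.formatted, so no schema or table name is hard-coded.
public class ShipSqlTemplates {

    public static String insertSql(String targetSchema, String table) {
        return """
                INSERT INTO %s.%s(
                    dataset_ver, imo_no,
                    job_execution_id, creatr_id
                )VALUES(
                    ?, ?, ?, ?
                );
                """.formatted(targetSchema, table);
    }

    public static void main(String[] args) {
        // e.g. the ship-015 key maps to tb_ship_ownr_hstry under t_std_snp_data
        System.out.println(insertSql("t_std_snp_data", "tb_ship_ownr_hstry"));
    }
}
```

Each repository then only supplies its own `@Value`-bound keys (e.g. `app.batch.target-schema.name` and `app.batch.target-schema.tables.ship-015`).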


@@ -3,6 +3,7 @@ package com.snp.batch.jobs.shipdetail.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.shipdetail.batch.entity.ShipHashEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -15,13 +16,24 @@ import java.util.List;
@Repository("ShipHashRepository")
public class ShipHashRepositoryImpl extends BaseJdbcRepository<ShipHashEntity, String> implements ShipHashRepository{
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.ship-028}")
private String tableName;
public ShipHashRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "snp_data.ship_detail_hash_json";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -37,7 +49,7 @@ public class ShipHashRepositoryImpl extends BaseJdbcRepository<ShipHashEntity, S
@Override
protected String getInsertSql() {
return """
INSERT INTO snp_data.ship_detail_hash_json(
INSERT INTO %s(
imo_number, ship_detail_hash, created_at, created_by, updated_at, updated_by
)VALUES(
?, ?, ?, ?, ?, ?
@@ -47,18 +59,18 @@ public class ShipHashRepositoryImpl extends BaseJdbcRepository<ShipHashEntity, S
ship_detail_hash = EXCLUDED.ship_detail_hash,
updated_at = ?,
updated_by = ?
""";
""".formatted(getTableName());
}
@Override
protected String getUpdateSql() {
return """
UPDATE snp_data.ship_detail_hash_json
UPDATE %s
SET ship_detail_hash = ?,
updated_at = ?,
updated_by = ?
WHERE imo_number = ?
""";
""".formatted(getTableName());
}
@Override

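The refactor above splits the old hard-coded `getTableName()` into `getTargetSchema()` and `getSimpleTableName()`, both fed from `@Value` properties. Assuming `BaseJdbcRepository` composes them as `schema.table` (the composition itself is not shown in the diff, so it is an assumption), the shape is roughly:

```java
// Assumption: BaseJdbcRepository derives the physical name from the two
// new hooks shown in the diff (getTargetSchema / getSimpleTableName).
// The composition below is a guess at that contract, not the actual code.
abstract class BaseJdbcRepositorySketch {

    protected abstract String getTargetSchema();    // e.g. "t_std_snp_data"
    protected abstract String getSimpleTableName(); // e.g. "ship_detail_hash_json"

    // Schema-qualified name used by getInsertSql()/getUpdateSql()
    protected String getTableName() {
        return getTargetSchema() + "." + getSimpleTableName();
    }
}

class ShipHashRepositorySketch extends BaseJdbcRepositorySketch {
    @Override protected String getTargetSchema() { return "t_std_snp_data"; }
    @Override protected String getSimpleTableName() { return "ship_detail_hash_json"; }

    String qualifiedName() { return getTableName(); }
}
```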

@@ -3,6 +3,7 @@ package com.snp.batch.jobs.shipdetail.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.shipdetail.batch.entity.TargetEnhancedEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -15,13 +16,25 @@ import java.util.List;
@Slf4j
@Repository("shipLastPositionRepository")
public class ShipLastPositionRepositoryImpl extends BaseJdbcRepository<TargetEnhancedEntity, Long> implements ShipLastPositionRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.ship-027}")
private String tableName;
public ShipLastPositionRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return null;
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -42,7 +55,7 @@ public class ShipLastPositionRepositoryImpl extends BaseJdbcRepository<TargetEnh
@Override
protected String getUpdateSql() {
return """
UPDATE new_snp.core20
UPDATE %s
SET lastseen = ?::timestamptz,
lastport = ?,
position_latitude = ?,
@@ -58,7 +71,7 @@ public class ShipLastPositionRepositoryImpl extends BaseJdbcRepository<TargetEnh
in_sts = ?,
on_berth = ?
WHERE lrno = ?;
""";
""".formatted(getTableName());
}
@Override


@@ -3,6 +3,7 @@ package com.snp.batch.jobs.shipimport.batch.repository;
import com.snp.batch.common.batch.repository.BaseJdbcRepository;
import com.snp.batch.jobs.shipimport.batch.entity.ShipEntity;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Repository;
@@ -18,13 +19,24 @@ import java.util.List;
@Repository("shipRepository")
public class ShipRepositoryImpl extends BaseJdbcRepository<ShipEntity, Long> implements ShipRepository {
@Value("${app.batch.target-schema.name}")
private String targetSchema;
@Value("${app.batch.target-schema.tables.ship-001}")
private String tableName;
public ShipRepositoryImpl(JdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Override
protected String getTableName() {
return "t_snp_data.ship_data";
protected String getTargetSchema() {
return targetSchema;
}
@Override
protected String getSimpleTableName() {
return tableName;
}
@Override
@@ -51,8 +63,8 @@ public class ShipRepositoryImpl extends BaseJdbcRepository<ShipEntity, Long> imp
protected String getUpdateSql() {
return """
INSERT INTO %s(
imo_number, core_ship_ind, dataset_version,
job_execution_id, created_by
imo_no, core_ship_ind, dataset_ver,
job_execution_id, creatr_id
) VALUES (?, ?, ?, ?, ?)
""".formatted(getTableName());
}


@@ -22,13 +22,13 @@ spring:
hibernate:
dialect: org.hibernate.dialect.PostgreSQLDialect
format_sql: true
default_schema: t_snp_data
default_schema: t_std_snp_data
# Batch Configuration
batch:
jdbc:
table-prefix: "t_snp_data.batch_"
initialize-schema: never # Changed to 'never' as tables already exist
table-prefix: "t_std_snp_data.batch_"
initialize-schema: always # DEV: auto-create Spring Batch metadata tables
job:
enabled: false # Prevent auto-run on startup
@@ -49,10 +49,27 @@ spring:
org.quartz.threadPool.threadCount: 10
org.quartz.jobStore.class: org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass: org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.tablePrefix: t_snp_data.QRTZ_
org.quartz.jobStore.tablePrefix: t_std_snp_data.QRTZ_
org.quartz.jobStore.isClustered: false
org.quartz.jobStore.misfireThreshold: 60000
# Kafka Configuration (DEV)
kafka:
bootstrap-servers: localhost:9092 # TODO: set the DEV Kafka broker IP/port
producer:
key-serializer: org.apache.kafka.common.serialization.StringSerializer
value-serializer: org.apache.kafka.common.serialization.StringSerializer
acks: all
retries: 3
properties:
enable.idempotence: true
compression.type: snappy
linger.ms: 20
batch.size: 65536
max.block.ms: 3000
request.timeout.ms: 5000
delivery.timeout.ms: 10000
# Server Configuration
server:
port: 8041
@@ -99,6 +116,11 @@ app:
chunk-size: 50000 # batch chunk size
schedule:
cron: "15 * * * * ?" # run at second 15 of every minute
kafka:
enabled: true
topic: tp_Global_AIS_Signal
send-chunk-size: 5000
fail-on-send-error: false
# AIS target cache settings
ais-target-cache:
ttl-minutes: 120 # cache TTL (minutes) - 2 hours
@@ -110,16 +132,16 @@ app:
# Core20 cache table settings (table/column names may differ per environment)
core20:
schema: t_snp_data # schema name
table: ship_detail_data # table name
imo-column: ihslrorimoshipno # IMO/LRNO column (PK, NOT NULL)
mmsi-column: maritimemobileserviceidentitymmsinumber # MMSI column (NULLABLE)
schema: t_std_snp_data # schema name
table: tb_ship_info_mst # table name
imo-column: imo_no # IMO/LRNO column (PK, NOT NULL)
mmsi-column: mmsi_no # MMSI column (NULLABLE)
# Partition management settings
partition:
# Daily partition table list (naming: {table}_YYMMDD)
daily-tables:
- schema: t_snp_data
- schema: t_std_snp_data
table-name: ais_target
partition-column: message_timestamp
periods-ahead: 3 # days of partitions to create in advance
@@ -132,4 +154,4 @@ app:
# Per-table retention settings (optional)
custom:
# - table-name: ais_target
# retention-days: 30 # keep ais_target for 30 days only
# retention-days: 30 # keep ais_target for 30 days only

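The `app.batch.ais-target.kafka` block above introduces `send-chunk-size: 5000` and `fail-on-send-error: false`. A sketch of what those two knobs plausibly mean for the producer pipeline — `ChunkedSender` and its method names are hypothetical, and the actual Kafka send (presumably a Spring `KafkaTemplate`) is reduced to a comment:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the two producer knobs configured above:
// send-chunk-size bounds how many records go out per send loop, and
// fail-on-send-error: false means a failed chunk is logged and skipped
// rather than failing the batch step.
public class ChunkedSender {

    // Split records into consecutive chunks of at most chunkSize entries.
    public static <T> List<List<T>> chunk(List<T> records, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < records.size(); i += chunkSize) {
            chunks.add(records.subList(i, Math.min(i + chunkSize, records.size())));
        }
        return chunks;
    }

    // Returns how many records were handed to the producer.
    public static int sendAll(List<String> records, int chunkSize, boolean failOnSendError) {
        int sent = 0;
        for (List<String> c : chunk(records, chunkSize)) {
            try {
                // kafkaTemplate.send("tp_Global_AIS_Signal", key, value) per record (omitted)
                sent += c.size();
            } catch (RuntimeException e) {
                if (failOnSendError) throw e; // fail-on-send-error: true
                // otherwise: log the failed chunk and continue
            }
        }
        return sent;
    }
}
```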

@@ -22,12 +22,12 @@ spring:
hibernate:
dialect: org.hibernate.dialect.PostgreSQLDialect
format_sql: true
default_schema: t_snp_data
default_schema: t_std_snp_data
# Batch Configuration
batch:
jdbc:
table-prefix: "t_snp_data.batch_"
table-prefix: "t_std_snp_data.batch_"
initialize-schema: never # Changed to 'never' as tables already exist
job:
enabled: false # Prevent auto-run on startup
@@ -49,10 +49,27 @@ spring:
org.quartz.threadPool.threadCount: 10
org.quartz.jobStore.class: org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass: org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.tablePrefix: t_snp_data.QRTZ_
org.quartz.jobStore.tablePrefix: t_std_snp_data.QRTZ_
org.quartz.jobStore.isClustered: false
org.quartz.jobStore.misfireThreshold: 60000
# Kafka Configuration (PROD)
kafka:
bootstrap-servers: localhost:9092 # TODO: set the PROD Kafka broker IP/port
producer:
key-serializer: org.apache.kafka.common.serialization.StringSerializer
value-serializer: org.apache.kafka.common.serialization.StringSerializer
acks: all
retries: 3
properties:
enable.idempotence: true
compression.type: snappy
linger.ms: 20
batch.size: 65536
max.block.ms: 3000
request.timeout.ms: 5000
delivery.timeout.ms: 10000
# Server Configuration
server:
port: 8041
@@ -101,6 +118,11 @@ app:
chunk-size: 50000 # batch chunk size
schedule:
cron: "15 * * * * ?" # run at second 15 of every minute
kafka:
enabled: true
topic: tp_Global_AIS_Signal
send-chunk-size: 5000
fail-on-send-error: false
# AIS target cache settings
ais-target-cache:
ttl-minutes: 120 # cache TTL (minutes) - 2 hours
@@ -112,16 +134,16 @@ app:
# Core20 cache table settings (table/column names may differ per environment)
core20:
schema: t_snp_data # schema name
table: ship_detail_data # table name
imo-column: ihslrorimoshipno # IMO/LRNO column (PK, NOT NULL)
mmsi-column: maritimemobileserviceidentitymmsinumber # MMSI column (NULLABLE)
schema: t_std_snp_data # schema name
table: tb_ship_info_mst # table name
imo-column: imo_no # IMO/LRNO column (PK, NOT NULL)
mmsi-column: mmsi_no # MMSI column (NULLABLE)
# Partition management settings
partition:
# Daily partition table list (naming: {table}_YYMMDD)
daily-tables:
- schema: t_snp_data
- schema: t_std_snp_data
table-name: ais_target
partition-column: message_timestamp
periods-ahead: 3 # days of partitions to create in advance
@@ -134,4 +156,4 @@ app:
# Per-table retention settings (optional)
custom:
# - table-name: ais_target
# retention-days: 30 # keep ais_target for 30 days only
# retention-days: 30 # keep ais_target for 30 days only


@@ -22,12 +22,12 @@ spring:
hibernate:
dialect: org.hibernate.dialect.PostgreSQLDialect
format_sql: true
default_schema: t_snp_data
default_schema: t_std_snp_data
# Batch Configuration
batch:
jdbc:
table-prefix: "t_snp_data.batch_"
table-prefix: "t_std_snp_data.batch_"
initialize-schema: never # Changed to 'never' as tables already exist
job:
enabled: false # Prevent auto-run on startup
@@ -49,10 +49,27 @@ spring:
org.quartz.threadPool.threadCount: 10
org.quartz.jobStore.class: org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass: org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.tablePrefix: t_snp_data.QRTZ_
org.quartz.jobStore.tablePrefix: t_std_snp_data.QRTZ_
org.quartz.jobStore.isClustered: false
org.quartz.jobStore.misfireThreshold: 60000
# Kafka Configuration
kafka:
bootstrap-servers: localhost:9092
producer:
key-serializer: org.apache.kafka.common.serialization.StringSerializer
value-serializer: org.apache.kafka.common.serialization.StringSerializer
acks: all
retries: 3
properties:
enable.idempotence: true
compression.type: snappy
linger.ms: 20
batch.size: 65536
max.block.ms: 3000
request.timeout.ms: 5000
delivery.timeout.ms: 10000
# Server Configuration
server:
port: 8041
@@ -77,6 +94,59 @@ logging:
app:
batch:
chunk-size: 1000
target-schema:
name: t_std_snp_data
tables:
ship-001: tb_ship_default_info
ship-002: tb_ship_info_mst
ship-003: tb_ship_add_info
ship-004: tb_ship_bbctr_hstry
ship-005: tb_ship_idntf_info_hstry
ship-006: tb_ship_clfic_hstry
ship-007: tb_ship_company_rel
ship-008: tb_ship_crew_list
ship-009: tb_ship_dark_actv_idnty
ship-010: tb_ship_country_hstry
ship-011: tb_ship_group_revn_ownr_hstry
ship-012: tb_ship_ice_grd
ship-013: tb_ship_nm_chg_hstry
ship-014: tb_ship_operator_hstry
ship-015: tb_ship_ownr_hstry
ship-016: tb_ship_prtc_rpn_hstry
ship-017: tb_ship_sfty_mng_evdc_hstry
ship-018: tb_ship_mng_company_hstry
ship-019: tb_ship_sstrvsl_rel
ship-020: tb_ship_spc_fetr
ship-021: tb_ship_status_hstry
ship-022: tb_ship_cargo_capacity
ship-023: tb_ship_inspection_ymd
ship-024: tb_ship_inspection_ymd_hstry
ship-025: tb_ship_tech_mng_company_hstry
ship-026: tb_ship_thrstr_info
company-001: tb_company_dtl_info
event-001: tb_event_mst
event-002: tb_event_cargo
event-003: tb_event_humn_acdnt
event-004: tb_event_rel
facility-001: tb_port_facility_info
psc-001: tb_psc_mst
psc-002: tb_psc_defect
psc-003: tb_psc_oa_certf
movements-001: tb_ship_anchrgcall_hstry
movements-002: tb_ship_berthcall_hstry
movements-003: tb_ship_now_status_hstry
movements-004: tb_ship_dest_hstry
movements-005: tb_ship_prtcll_hstry
movements-006: tb_ship_sts_opert_hstry
movements-007: tb_ship_teminalcall_hstry
movements-008: tb_ship_trnst_hstry
code-001: tb_ship_type_cd
code-002: tb_ship_country_cd
risk-compliance-001: tb_ship_risk_info
risk-compliance-002: tb_ship_compliance_info
risk-compliance-003: tb_company_compliance_info
ship-027: core20
ship-028: ship_detail_hash_json
api:
url: https://api.example.com/data
timeout: 30000
@@ -98,6 +168,11 @@ app:
chunk-size: 50000 # batch chunk size
schedule:
cron: "15 * * * * ?" # run at second 15 of every minute
kafka:
enabled: true
topic: tp_Global_AIS_Signal
send-chunk-size: 5000
fail-on-send-error: false
# AIS target DB sync batch settings (cache → DB persistence)
ais-target-db-sync:
@@ -116,16 +191,16 @@ app:
# Core20 cache table settings (table/column names may differ per environment)
core20:
schema: t_snp_data # schema name
table: ship_detail_data # table name
imo-column: ihslrorimoshipno # IMO/LRNO column (PK, NOT NULL)
mmsi-column: maritimemobileserviceidentitymmsinumber # MMSI column (NULLABLE)
schema: t_std_snp_data # schema name
table: tb_ship_info_mst # table name
imo-column: imo_no # IMO/LRNO column (PK, NOT NULL)
mmsi-column: mmsi_no # MMSI column (NULLABLE)
# Partition management settings
partition:
# Daily partition table list (naming: {table}_YYMMDD)
daily-tables:
- schema: t_snp_data
- schema: t_std_snp_data
table-name: ais_target
partition-column: message_timestamp
periods-ahead: 3 # days of partitions to create in advance
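The `partition` block above names daily partitions `{table}_YYMMDD` and pre-creates them `periods-ahead` days in advance. A small sketch of that naming rule — `DailyPartitionNamer` is a hypothetical helper, not project code:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;

// Sketch of the daily-partition naming configured above:
// partitions are named {table}_YYMMDD, created periods-ahead days early.
public class DailyPartitionNamer {

    private static final DateTimeFormatter YYMMDD = DateTimeFormatter.ofPattern("yyMMdd");

    // Partition names for the given date plus the next periodsAhead days.
    public static List<String> upcomingPartitions(String table, LocalDate from, int periodsAhead) {
        List<String> names = new ArrayList<>();
        for (int d = 0; d <= periodsAhead; d++) {
            names.add(table + "_" + from.plusDays(d).format(YYMMDD));
        }
        return names;
    }
}
```

With `periods-ahead: 3`, a run on 2026-02-14 would cover `ais_target_260214` through `ais_target_260217`.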